Mar 20 10:55:12 crc systemd[1]: Starting Kubernetes Kubelet...
Mar 20 10:55:12 crc restorecon[4677]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 20 10:55:12 crc restorecon[4677]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c377,c642 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 
10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Mar 20 10:55:13 crc 
restorecon[4677]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c37,c572 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 
10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset 
as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 
10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc 
restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:55:13 crc restorecon[4677]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Mar 20 10:55:13 crc restorecon[4677]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Mar 20 10:55:14 crc kubenswrapper[4772]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 10:55:14 crc kubenswrapper[4772]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Mar 20 10:55:14 crc kubenswrapper[4772]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 10:55:14 crc kubenswrapper[4772]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Mar 20 10:55:14 crc kubenswrapper[4772]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Mar 20 10:55:14 crc kubenswrapper[4772]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.345015 4772 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355649 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355690 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355704 4772 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355722 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355737 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355748 4772 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355759 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355769 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355781 4772 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355792 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355803 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355813 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355828 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355881 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355897 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355911 4772 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355922 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355934 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355948 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355961 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355973 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.355987 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356001 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356014 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356028 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356040 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356052 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356063 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356075 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356089 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356104 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356117 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356130 4772 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356142 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356153 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356164 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356175 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356186 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356196 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356207 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356218 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356228 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356239 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356250 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356260 4772 
feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356270 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356281 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356292 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356303 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356313 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356324 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356335 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356347 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356358 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356369 4772 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356381 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356391 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356402 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356414 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356425 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356435 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356446 4772 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356457 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356468 4772 feature_gate.go:330] unrecognized feature gate: Example Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356478 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356492 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356503 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356515 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356526 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.356535 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 10:55:14 crc 
kubenswrapper[4772]: W0320 10:55:14.356544 4772 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.356804 4772 flags.go:64] FLAG: --address="0.0.0.0" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.356832 4772 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.356897 4772 flags.go:64] FLAG: --anonymous-auth="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.356913 4772 flags.go:64] FLAG: --application-metrics-count-limit="100" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.356930 4772 flags.go:64] FLAG: --authentication-token-webhook="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.356942 4772 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.356958 4772 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.356974 4772 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.356988 4772 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357000 4772 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357013 4772 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357025 4772 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357039 4772 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357052 4772 flags.go:64] FLAG: --cgroup-root="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357064 4772 flags.go:64] FLAG: --cgroups-per-qos="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357077 4772 flags.go:64] FLAG: --client-ca-file="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357090 4772 flags.go:64] FLAG: --cloud-config="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357104 4772 flags.go:64] FLAG: --cloud-provider="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357116 4772 flags.go:64] FLAG: --cluster-dns="[]" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357130 4772 flags.go:64] FLAG: --cluster-domain="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357141 4772 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357153 4772 flags.go:64] FLAG: --config-dir="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357166 4772 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357181 4772 flags.go:64] FLAG: --container-log-max-files="5" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357197 4772 flags.go:64] FLAG: --container-log-max-size="10Mi" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357209 4772 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357221 4772 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357234 4772 flags.go:64] FLAG: --containerd-namespace="k8s.io" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 
10:55:14.357245 4772 flags.go:64] FLAG: --contention-profiling="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357258 4772 flags.go:64] FLAG: --cpu-cfs-quota="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357269 4772 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357282 4772 flags.go:64] FLAG: --cpu-manager-policy="none" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357293 4772 flags.go:64] FLAG: --cpu-manager-policy-options="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357308 4772 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357320 4772 flags.go:64] FLAG: --enable-controller-attach-detach="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357331 4772 flags.go:64] FLAG: --enable-debugging-handlers="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357343 4772 flags.go:64] FLAG: --enable-load-reader="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357355 4772 flags.go:64] FLAG: --enable-server="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357366 4772 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357383 4772 flags.go:64] FLAG: --event-burst="100" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357395 4772 flags.go:64] FLAG: --event-qps="50" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357406 4772 flags.go:64] FLAG: --event-storage-age-limit="default=0" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357418 4772 flags.go:64] FLAG: --event-storage-event-limit="default=0" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357430 4772 flags.go:64] FLAG: --eviction-hard="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357446 4772 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357459 4772 flags.go:64] FLAG: --eviction-minimum-reclaim="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357470 4772 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357483 4772 flags.go:64] FLAG: --eviction-soft="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357495 4772 flags.go:64] FLAG: --eviction-soft-grace-period="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357507 4772 flags.go:64] FLAG: --exit-on-lock-contention="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357519 4772 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357530 4772 flags.go:64] FLAG: --experimental-mounter-path="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357542 4772 flags.go:64] FLAG: --fail-cgroupv1="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357555 4772 flags.go:64] FLAG: --fail-swap-on="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357567 4772 flags.go:64] FLAG: --feature-gates="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357583 4772 flags.go:64] FLAG: --file-check-frequency="20s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357595 4772 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357608 4772 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357620 
4772 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357634 4772 flags.go:64] FLAG: --healthz-port="10248" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357647 4772 flags.go:64] FLAG: --help="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357659 4772 flags.go:64] FLAG: --hostname-override="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357672 4772 flags.go:64] FLAG: --housekeeping-interval="10s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357684 4772 flags.go:64] FLAG: --http-check-frequency="20s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357696 4772 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357708 4772 flags.go:64] FLAG: --image-credential-provider-config="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357719 4772 flags.go:64] FLAG: --image-gc-high-threshold="85" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357731 4772 flags.go:64] FLAG: --image-gc-low-threshold="80" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357743 4772 flags.go:64] FLAG: --image-service-endpoint="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357754 4772 flags.go:64] FLAG: --kernel-memcg-notification="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357765 4772 flags.go:64] FLAG: --kube-api-burst="100" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357778 4772 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357790 4772 flags.go:64] FLAG: --kube-api-qps="50" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357801 4772 flags.go:64] FLAG: --kube-reserved="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357813 4772 flags.go:64] FLAG: --kube-reserved-cgroup="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357824 4772 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357872 4772 flags.go:64] FLAG: --kubelet-cgroups="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357886 4772 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357898 4772 flags.go:64] FLAG: --lock-file="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357909 4772 flags.go:64] FLAG: --log-cadvisor-usage="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357923 4772 flags.go:64] FLAG: --log-flush-frequency="5s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357938 4772 flags.go:64] FLAG: --log-json-info-buffer-size="0" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357958 4772 flags.go:64] FLAG: --log-json-split-stream="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357970 4772 flags.go:64] FLAG: --log-text-info-buffer-size="0" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357982 4772 flags.go:64] FLAG: --log-text-split-stream="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.357993 4772 flags.go:64] FLAG: --logging-format="text" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358009 4772 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358023 4772 flags.go:64] FLAG: --make-iptables-util-chains="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358034 4772 
flags.go:64] FLAG: --manifest-url="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358047 4772 flags.go:64] FLAG: --manifest-url-header="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358065 4772 flags.go:64] FLAG: --max-housekeeping-interval="15s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358078 4772 flags.go:64] FLAG: --max-open-files="1000000" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358095 4772 flags.go:64] FLAG: --max-pods="110" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358108 4772 flags.go:64] FLAG: --maximum-dead-containers="-1" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358119 4772 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358133 4772 flags.go:64] FLAG: --memory-manager-policy="None" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358144 4772 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358157 4772 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358169 4772 flags.go:64] FLAG: --node-ip="192.168.126.11" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358180 4772 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358241 4772 flags.go:64] FLAG: --node-status-max-images="50" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358253 4772 flags.go:64] FLAG: --node-status-update-frequency="10s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358265 4772 flags.go:64] FLAG: --oom-score-adj="-999" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358277 4772 flags.go:64] FLAG: --pod-cidr="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358289 4772 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358306 4772 flags.go:64] FLAG: --pod-manifest-path="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358317 4772 flags.go:64] FLAG: --pod-max-pids="-1" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358329 4772 flags.go:64] FLAG: --pods-per-core="0" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358341 4772 flags.go:64] FLAG: --port="10250" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358353 4772 flags.go:64] FLAG: --protect-kernel-defaults="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358364 4772 flags.go:64] FLAG: --provider-id="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358375 4772 flags.go:64] FLAG: --qos-reserved="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358387 4772 flags.go:64] FLAG: --read-only-port="10255" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358402 4772 flags.go:64] FLAG: --register-node="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358414 4772 flags.go:64] FLAG: --register-schedulable="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358426 4772 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358487 4772 flags.go:64] FLAG: --registry-burst="10" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358502 4772 flags.go:64] 
FLAG: --registry-qps="5" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358514 4772 flags.go:64] FLAG: --reserved-cpus="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358525 4772 flags.go:64] FLAG: --reserved-memory="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358541 4772 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358553 4772 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358566 4772 flags.go:64] FLAG: --rotate-certificates="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358578 4772 flags.go:64] FLAG: --rotate-server-certificates="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358590 4772 flags.go:64] FLAG: --runonce="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358601 4772 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358613 4772 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358625 4772 flags.go:64] FLAG: --seccomp-default="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358637 4772 flags.go:64] FLAG: --serialize-image-pulls="true" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358649 4772 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358661 4772 flags.go:64] FLAG: --storage-driver-db="cadvisor" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358675 4772 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358687 4772 flags.go:64] FLAG: --storage-driver-password="root" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358699 4772 flags.go:64] FLAG: --storage-driver-secure="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358711 4772 flags.go:64] FLAG: --storage-driver-table="stats" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358723 4772 flags.go:64] FLAG: --storage-driver-user="root" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358734 4772 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358747 4772 flags.go:64] FLAG: --sync-frequency="1m0s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358759 4772 flags.go:64] FLAG: --system-cgroups="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358770 4772 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358788 4772 flags.go:64] FLAG: --system-reserved-cgroup="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358800 4772 flags.go:64] FLAG: --tls-cert-file="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358811 4772 flags.go:64] FLAG: --tls-cipher-suites="[]" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358826 4772 flags.go:64] FLAG: --tls-min-version="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358871 4772 flags.go:64] FLAG: --tls-private-key-file="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358886 4772 flags.go:64] FLAG: --topology-manager-policy="none" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358898 4772 flags.go:64] FLAG: --topology-manager-policy-options="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358910 4772 flags.go:64] FLAG: 
--topology-manager-scope="container" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358922 4772 flags.go:64] FLAG: --v="2" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358939 4772 flags.go:64] FLAG: --version="false" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358955 4772 flags.go:64] FLAG: --vmodule="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358969 4772 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.358981 4772 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359260 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359277 4772 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359289 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359300 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359311 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359324 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359339 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359351 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359364 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359377 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359389 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359400 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359411 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359422 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359438 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
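The flags.go:64 records above are a full dump of the kubelet's effective flag values, e.g. --config="/etc/kubernetes/kubelet.conf", --node-ip="192.168.126.11", --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" and --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi". A hypothetical sketch (again, not shipped with the kubelet) for turning that dump into a dict, which makes it easy to diff the effective flags between two boots:

# Hypothetical helper: parse the kubelet's 'flags.go:64] FLAG: --name="value"'
# dump into a dict of flag -> effective string value. Assumes the journal text
# for one boot is provided on stdin.
import re
import sys

FLAG = re.compile(r'FLAG: (--[A-Za-z0-9-]+)="([^"]*)"')

def flag_values(journal_text: str) -> dict[str, str]:
    """Map each dumped kubelet flag to the value printed in the FLAG dump."""
    return dict(FLAG.findall(journal_text))

if __name__ == "__main__":
    flags = flag_values(sys.stdin.read())
    print(flags.get("--config"))           # e.g. /etc/kubernetes/kubelet.conf
    print(flags.get("--node-ip"))          # e.g. 192.168.126.11
    print(flags.get("--system-reserved"))  # e.g. cpu=200m,ephemeral-storage=350Mi,memory=350Mi

Comparing two such dicts (one per boot) shows exactly which flag values changed across a restart; the dump itself is informational and does not indicate any error.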
Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359451 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359463 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359474 4772 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359484 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359495 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359507 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359519 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359531 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359543 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359556 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359568 4772 feature_gate.go:330] unrecognized feature gate: Example Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359581 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359591 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359603 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359613 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359623 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359633 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359644 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359654 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359664 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359674 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359685 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359695 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359705 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359715 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359725 4772 feature_gate.go:330] unrecognized feature gate: 
AdditionalRoutingCapabilities Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359735 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359748 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359758 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359769 4772 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359779 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359788 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359798 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359809 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359819 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359831 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359878 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359889 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359900 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359911 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359921 4772 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359933 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359943 4772 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359955 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359965 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359979 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359989 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.359999 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.360009 4772 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.360024 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.360036 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.360047 4772 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.360059 4772 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.360070 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.360081 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.360092 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.360126 4772 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.378628 4772 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.378711 4772 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.378910 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.378929 4772 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.378942 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.378954 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.378964 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.378976 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
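The feature_gate.go:386 record above is the resolved summary of the gates this kubelet (v1.31.5 per the "Kubelet version" record) actually applies: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true ... ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}. The preceding "unrecognized feature gate" warnings are for gate names the kubelet's own registry does not know, and the same summary is logged again below as the gates are re-applied. A hypothetical parser for that one summary line:

# Hypothetical helper: parse one "feature gates: {map[Name:bool ...]}" summary
# line (as logged by feature_gate.go:386 above) into a dict of gate -> enabled.
import re

GATE = re.compile(r"(\w+):(true|false)")

def parse_feature_gates(summary_line: str) -> dict[str, bool]:
    """Extract gate settings from a single 'feature gates: {map[...]}' log record."""
    return {name: value == "true" for name, value in GATE.findall(summary_line)}

# Example taken from (a shortened form of) the record above:
example = "feature gates: {map[CloudDualStackNodeIPs:true KMSv1:true ValidatingAdmissionPolicy:true VolumeAttributesClass:false]}"
print(parse_feature_gates(example))
# {'CloudDualStackNodeIPs': True, 'KMSv1': True, 'ValidatingAdmissionPolicy': True, 'VolumeAttributesClass': False}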
Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.378991 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379000 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379008 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379016 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379024 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379033 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379042 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379050 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379058 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379068 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379076 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379084 4772 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379091 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379099 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379107 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379116 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379126 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379135 4772 feature_gate.go:330] unrecognized feature gate: Example Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379147 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379158 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379167 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379178 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379188 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379196 4772 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379204 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379213 4772 feature_gate.go:330] unrecognized feature gate: 
ConsolePluginContentSecurityPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379221 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379231 4772 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379239 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379247 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379254 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379262 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379270 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379278 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379286 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379293 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379301 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379312 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379324 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379334 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379348 4772 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379362 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379373 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379385 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379395 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379407 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379417 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379428 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379439 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379451 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379461 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 10:55:14 crc 
kubenswrapper[4772]: W0320 10:55:14.379470 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379481 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379492 4772 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379503 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379517 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379531 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379541 4772 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379553 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379564 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379575 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379585 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379595 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379603 4772 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379612 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.379627 4772 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379948 4772 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379969 4772 feature_gate.go:330] unrecognized feature gate: Example Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379978 4772 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379987 4772 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.379995 4772 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380004 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380015 4772 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380027 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380039 4772 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380050 4772 feature_gate.go:330] unrecognized feature gate: PlatformOperators Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380059 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380069 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380077 4772 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380086 4772 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380095 4772 feature_gate.go:330] unrecognized feature gate: GatewayAPI Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380104 4772 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380112 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380122 4772 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380131 4772 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380140 4772 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380148 4772 feature_gate.go:330] unrecognized feature gate: PinnedImages Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380156 4772 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380165 4772 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380176 4772 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380186 4772 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380195 4772 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380204 4772 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380213 4772 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380220 4772 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380228 4772 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380236 4772 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380244 4772 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380252 4772 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380260 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380268 4772 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380276 4772 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380284 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380295 4772 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380304 4772 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380311 4772 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380321 4772 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380330 4772 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380337 4772 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380346 4772 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380354 4772 feature_gate.go:330] unrecognized feature gate: NewOLM Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380361 4772 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380369 4772 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380377 4772 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380384 4772 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380392 4772 feature_gate.go:330] unrecognized feature gate: DNSNameResolver 
Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380400 4772 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380407 4772 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380415 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380423 4772 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380431 4772 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380442 4772 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380452 4772 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380462 4772 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380471 4772 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380479 4772 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380487 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380496 4772 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380504 4772 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380512 4772 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380521 4772 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380529 4772 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380537 4772 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380545 4772 feature_gate.go:330] unrecognized feature gate: OVNObservability Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380553 4772 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380562 4772 feature_gate.go:330] unrecognized feature gate: SignatureStores Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.380569 4772 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.380583 4772 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Mar 20 10:55:14 crc 
kubenswrapper[4772]: I0320 10:55:14.380979 4772 server.go:940] "Client rotation is on, will bootstrap in background" Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.387134 4772 bootstrap.go:266] "Unhandled Error" err="part of the existing bootstrap client certificate in /var/lib/kubelet/kubeconfig is expired: 2026-02-24 05:52:08 +0000 UTC" logger="UnhandledError" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.392782 4772 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.392983 4772 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.396291 4772 server.go:997] "Starting client certificate rotation" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.396341 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.396565 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.423593 4772 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.427178 4772 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.428767 4772 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.454195 4772 log.go:25] "Validated CRI v1 runtime API" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.504575 4772 log.go:25] "Validated CRI v1 image API" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.507578 4772 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.513317 4772 fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-03-20-10-48-51-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.513369 4772 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:42 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:43 fsType:tmpfs blockSize:0}] Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.542433 4772 manager.go:217] Machine: {Timestamp:2026-03-20 10:55:14.539076448 +0000 UTC m=+0.630043003 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654128640 
SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:6fe7d706-49bf-443a-bc98-4f48ecaccc59 BootID:339e6ef0-bc07-454a-8bb1-f97440045fc1 Filesystems:[{Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:42 Capacity:3365412864 Type:vfs Inodes:821634 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:43 Capacity:1073741824 Type:vfs Inodes:4108170 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827064320 Type:vfs Inodes:4108170 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:0f:fc:1c Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:0f:fc:1c Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:73:fb:58 Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:6a:35:03 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:75:39:ac Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:9f:83:75 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:86:dd:81:0d:6b:35 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:1e:6b:1b:e6:9e:26 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654128640 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] 
Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.542991 4772 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.543244 4772 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.545020 4772 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.545514 4772 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.545579 4772 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" 
nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.546109 4772 topology_manager.go:138] "Creating topology manager with none policy" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.546135 4772 container_manager_linux.go:303] "Creating device plugin manager" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.546755 4772 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.546818 4772 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.548032 4772 state_mem.go:36] "Initialized new in-memory state store" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.548214 4772 server.go:1245] "Using root directory" path="/var/lib/kubelet" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.552423 4772 kubelet.go:418] "Attempting to sync node with API server" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.552464 4772 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.552510 4772 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.552535 4772 kubelet.go:324] "Adding apiserver pod source" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.552555 4772 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.557687 4772 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.558645 4772 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.558684 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.558874 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.558923 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.559025 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.561537 4772 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563417 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563443 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563452 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563460 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563473 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563480 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563489 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563500 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563509 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563519 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563531 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.563539 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.564431 4772 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/csi" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 
10:55:14.564949 4772 server.go:1280] "Started kubelet" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.565917 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.565968 4772 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.565941 4772 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.567297 4772 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Mar 20 10:55:14 crc systemd[1]: Started Kubernetes Kubelet. Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.569398 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.569487 4772 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.569690 4772 volume_manager.go:287] "The desired_state_of_world populator starts" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.569821 4772 volume_manager.go:289] "Starting Kubelet Volume Manager" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.570048 4772 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.569701 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.570433 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.570523 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.570999 4772 factory.go:55] Registering systemd factory Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.571036 4772 factory.go:221] Registration of the systemd container factory successfully Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.571550 4772 factory.go:153] Registering CRI-O factory Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.571605 4772 factory.go:221] Registration of the crio container factory successfully Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.571625 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="200ms" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.571754 4772 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: 
dial unix /run/containerd/containerd.sock: connect: no such file or directory Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.571769 4772 server.go:460] "Adding debug handlers to kubelet server" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.571796 4772 factory.go:103] Registering Raw factory Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.572171 4772 manager.go:1196] Started watching for new ooms in manager Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.579472 4772 manager.go:319] Starting recovery of all containers Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.587425 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.119:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e8757d6b9133a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.56491193 +0000 UTC m=+0.655878415,LastTimestamp:2026-03-20 10:55:14.56491193 +0000 UTC m=+0.655878415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.593059 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.593377 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.593525 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.593654 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.593769 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.593908 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.594026 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.594147 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.594298 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.594435 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.594595 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.594718 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.594864 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.594997 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.595117 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.595229 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.595336 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.595418 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.595502 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.595584 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.595663 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.595748 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.595893 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.596018 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.596108 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.596191 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.596287 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.596398 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.596485 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.596572 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.596654 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.596739 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.596872 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.596963 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.597058 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.597144 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.597223 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.597303 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.597392 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.597676 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.597786 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.597901 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.597983 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.598061 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.598137 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.598229 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.598334 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.598420 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.598511 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.598592 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.598700 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" 
volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.598830 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.599040 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.599139 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.599225 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.599324 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.599408 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.599499 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.599580 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.599660 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.599767 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.599913 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" 
volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.600045 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.600189 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.600300 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.600386 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.600488 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.600581 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.600662 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.600736 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.600816 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.600937 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.601021 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" 
volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.601113 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.601196 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.601301 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.601404 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.601503 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.601585 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.601668 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.601751 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.601866 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.601964 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.602046 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.602125 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.602204 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.602314 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.602407 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.602486 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.602561 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.602641 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.602717 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.602818 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.602923 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.603007 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.603130 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.603215 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.603301 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.603390 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.603490 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.603566 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.603642 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.606029 4772 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.606163 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.606290 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.606387 4772 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.606488 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.606572 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.606658 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.606737 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.606813 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.606914 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.606995 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.607092 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.607176 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.607254 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.607328 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.607414 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.607492 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.607574 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.607649 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.607725 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.607800 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.607902 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608001 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608082 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608157 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608232 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608308 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608390 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608471 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608546 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608619 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608726 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608809 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.608919 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609002 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609079 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609152 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609230 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609313 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609387 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609460 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609537 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609627 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609712 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609795 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.609980 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610060 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610137 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" 
volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610217 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610325 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610409 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610492 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610567 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610646 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610733 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610812 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610915 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.610999 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.611078 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.611172 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.611248 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.611324 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.611400 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.611475 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.611557 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.611668 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.611747 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.611822 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.611921 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612015 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" 
volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612097 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612173 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612247 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612331 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612412 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612495 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612571 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612644 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612722 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612798 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.612902 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" 
volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.613003 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.613742 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.613808 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.613852 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.613875 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.613897 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.613917 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.613937 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.613959 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.613989 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614012 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" 
volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614033 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614056 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614076 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614104 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614124 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614145 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614165 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614183 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614202 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614221 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614238 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614301 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614322 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614344 4772 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614365 4772 reconstruct.go:97] "Volume reconstruction finished" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.614378 4772 reconciler.go:26] "Reconciler: start to sync state" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.615252 4772 manager.go:324] Recovery completed Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.631068 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.632601 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.632667 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.632687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.634155 4772 cpu_manager.go:225] "Starting CPU manager" policy="none" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.634209 4772 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.634284 4772 state_mem.go:36] "Initialized new in-memory state store" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.638957 4772 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.640524 4772 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.640589 4772 status_manager.go:217] "Starting to sync pod status with apiserver" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.640635 4772 kubelet.go:2335] "Starting kubelet main sync loop" Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.640724 4772 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Mar 20 10:55:14 crc kubenswrapper[4772]: W0320 10:55:14.641323 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.641440 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.658387 4772 policy_none.go:49] "None policy: Start" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.660032 4772 memory_manager.go:170] "Starting memorymanager" policy="None" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.660072 4772 state_mem.go:35] "Initializing new in-memory state store" Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.674480 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.735166 4772 manager.go:334] "Starting Device Plugin manager" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.735496 4772 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.735513 4772 server.go:79] "Starting device plugin registration server" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.736057 4772 eviction_manager.go:189] "Eviction manager: starting control loop" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.736077 4772 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.736346 4772 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.736442 4772 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.736451 4772 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.740996 4772 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.741143 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.743107 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.743152 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.743162 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.743375 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.744046 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.744088 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.745226 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.745243 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.745255 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.746701 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.746745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.746758 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.746978 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.747192 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.747253 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.747933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.747965 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.747982 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.748080 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.748106 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.748124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.748142 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.748372 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.748457 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.750634 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.772979 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="400ms" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.796114 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.796191 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.796218 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.796140 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.796294 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.796340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.796617 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.796668 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.796686 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.798020 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.798051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.798060 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.798269 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.798294 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.799121 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.799143 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.799176 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.799669 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.799706 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.799719 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816436 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816495 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816530 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:14 crc 
kubenswrapper[4772]: I0320 10:55:14.816559 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816634 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816682 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816721 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816753 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816806 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816835 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816871 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816897 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.816919 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.836886 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.837854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.837892 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.837905 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.837928 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:14 crc kubenswrapper[4772]: E0320 10:55:14.838381 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.918034 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.918335 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.918584 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.918808 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.918876 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.918910 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc 
kubenswrapper[4772]: I0320 10:55:14.918939 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.918970 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.918993 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919028 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919057 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919081 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919106 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919134 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919165 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919193 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " 
pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919446 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919576 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919591 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919624 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919645 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919672 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919653 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919691 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919605 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919732 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: 
\"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919802 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Mar 20 10:55:14 crc kubenswrapper[4772]: I0320 10:55:14.919803 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.020982 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.021057 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.021237 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.021398 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.038983 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.041000 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.041054 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.041068 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.041107 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:15 crc kubenswrapper[4772]: E0320 10:55:15.041655 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.121717 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.130616 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd/etcd-crc" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.161033 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:15 crc kubenswrapper[4772]: W0320 10:55:15.170675 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-3a8775ff8ab0a7ac35d4b070efdc876273f5b492ed9b55859b49d99785eb6e43 WatchSource:0}: Error finding container 3a8775ff8ab0a7ac35d4b070efdc876273f5b492ed9b55859b49d99785eb6e43: Status 404 returned error can't find the container with id 3a8775ff8ab0a7ac35d4b070efdc876273f5b492ed9b55859b49d99785eb6e43 Mar 20 10:55:15 crc kubenswrapper[4772]: E0320 10:55:15.174517 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="800ms" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.175663 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:55:15 crc kubenswrapper[4772]: W0320 10:55:15.179151 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2139d3e2895fc6797b9c76a1b4c9886d.slice/crio-72f6528030dce2b5de601e07e9c4d41396a64d1a2dbd2c75b92ecaffe35482d5 WatchSource:0}: Error finding container 72f6528030dce2b5de601e07e9c4d41396a64d1a2dbd2c75b92ecaffe35482d5: Status 404 returned error can't find the container with id 72f6528030dce2b5de601e07e9c4d41396a64d1a2dbd2c75b92ecaffe35482d5 Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.181815 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:15 crc kubenswrapper[4772]: W0320 10:55:15.208229 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-eaf74044018f0a1e3b835441890ebe767b26830d83ab036284e1f72f50cd6619 WatchSource:0}: Error finding container eaf74044018f0a1e3b835441890ebe767b26830d83ab036284e1f72f50cd6619: Status 404 returned error can't find the container with id eaf74044018f0a1e3b835441890ebe767b26830d83ab036284e1f72f50cd6619 Mar 20 10:55:15 crc kubenswrapper[4772]: W0320 10:55:15.210360 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-420a5cb332c24b2fa979a5ecc57200b1991db6f71275652d0b4c3ccfc95cff5a WatchSource:0}: Error finding container 420a5cb332c24b2fa979a5ecc57200b1991db6f71275652d0b4c3ccfc95cff5a: Status 404 returned error can't find the container with id 420a5cb332c24b2fa979a5ecc57200b1991db6f71275652d0b4c3ccfc95cff5a Mar 20 10:55:15 crc kubenswrapper[4772]: W0320 10:55:15.213319 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a43fd1a7e100af38e6617c5010fe93661e05a19c4802d34852d5e60d68652842 WatchSource:0}: Error finding container a43fd1a7e100af38e6617c5010fe93661e05a19c4802d34852d5e60d68652842: Status 404 returned error can't find the container with id a43fd1a7e100af38e6617c5010fe93661e05a19c4802d34852d5e60d68652842 Mar 20 10:55:15 crc kubenswrapper[4772]: W0320 10:55:15.377146 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:15 crc kubenswrapper[4772]: E0320 10:55:15.377234 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.442500 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.443870 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.443913 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.443922 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.443947 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:15 crc kubenswrapper[4772]: E0320 10:55:15.444267 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 
10:55:15.567865 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:15 crc kubenswrapper[4772]: W0320 10:55:15.603965 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:15 crc kubenswrapper[4772]: E0320 10:55:15.604087 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:15 crc kubenswrapper[4772]: W0320 10:55:15.630491 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:15 crc kubenswrapper[4772]: E0320 10:55:15.630648 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.646615 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3a8775ff8ab0a7ac35d4b070efdc876273f5b492ed9b55859b49d99785eb6e43"} Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.647876 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a43fd1a7e100af38e6617c5010fe93661e05a19c4802d34852d5e60d68652842"} Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.653197 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"eaf74044018f0a1e3b835441890ebe767b26830d83ab036284e1f72f50cd6619"} Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.656995 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"420a5cb332c24b2fa979a5ecc57200b1991db6f71275652d0b4c3ccfc95cff5a"} Mar 20 10:55:15 crc kubenswrapper[4772]: I0320 10:55:15.658246 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"72f6528030dce2b5de601e07e9c4d41396a64d1a2dbd2c75b92ecaffe35482d5"} Mar 20 10:55:15 crc kubenswrapper[4772]: W0320 10:55:15.904658 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get 
"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:15 crc kubenswrapper[4772]: E0320 10:55:15.904793 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:15 crc kubenswrapper[4772]: E0320 10:55:15.975564 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="1.6s" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.244960 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.247113 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.247220 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.247284 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.247331 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:16 crc kubenswrapper[4772]: E0320 10:55:16.248188 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.457629 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:55:16 crc kubenswrapper[4772]: E0320 10:55:16.459419 4772 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.567677 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.665959 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"fa8330ebb1d9f35b3265890086718ee7a34b8129ab2a14536e5eaf679ccf417a"} Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.666019 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957"} Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.666037 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f"} Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.670494 4772 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="9d4830c4ca712050e5d5d77e415c0b08f95d356694b48914c735a2c51ccc8956" exitCode=0 Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.670589 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"9d4830c4ca712050e5d5d77e415c0b08f95d356694b48914c735a2c51ccc8956"} Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.670715 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.673327 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e" exitCode=0 Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.673472 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e"} Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.673526 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.673613 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.673646 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.673488 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.675027 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.675077 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.675095 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.676820 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.677716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.677761 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 
10:55:16.677774 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.680241 4772 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42" exitCode=0 Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.680338 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42"} Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.680384 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.686470 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.686556 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.686584 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.688194 4772 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="086343258fe79aa85d0001eadacdcb7fa41cc4be5767b69dacd3b88b158f82e1" exitCode=0 Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.688251 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"086343258fe79aa85d0001eadacdcb7fa41cc4be5767b69dacd3b88b158f82e1"} Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.688377 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.689705 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.689749 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:16 crc kubenswrapper[4772]: I0320 10:55:16.689770 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4772]: E0320 10:55:17.174269 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.119:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.189e8757d6b9133a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.56491193 +0000 UTC m=+0.655878415,LastTimestamp:2026-03-20 10:55:14.56491193 +0000 UTC m=+0.655878415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:17 crc kubenswrapper[4772]: W0320 10:55:17.174991 4772 
reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:17 crc kubenswrapper[4772]: E0320 10:55:17.175104 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.567082 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:17 crc kubenswrapper[4772]: E0320 10:55:17.577023 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="3.2s" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.705971 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"3a13113e6fe1adf18d397e3df66452dd12ed0d4d2f1d7cec373d0868f7e81512"} Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.706535 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.707935 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.707972 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.707983 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.709414 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"b55dc0780a7d7d7ce5e51f53e62250dc1b32daf50598a8a7858d4bd58857affd"} Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.709521 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.710785 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.710906 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.710995 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.713771 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"6d33fd0873f0fe72f8f380e8f57d887c6908f3dfe5d207f461855c83796e1b16"} Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.713801 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"97b875b88a814ccfbc5bf71292c5869f4bd7b7f005383d7ae40a6027987ce57a"} Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.713818 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"eb9033b2be08ce32fd65b95fa314492bd2a117445313699e62962f27e5c48356"} Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.713950 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.714827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.714891 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.714920 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.718466 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604"} Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.718499 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61"} Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.718515 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed"} Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.718527 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518"} Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.720253 4772 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea" exitCode=0 Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.720292 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea"} Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.720430 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.721077 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.721098 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.721110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.848986 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.851440 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.851474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.851485 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.851508 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:17 crc kubenswrapper[4772]: E0320 10:55:17.852008 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.119:6443: connect: connection refused" node="crc" Mar 20 10:55:17 crc kubenswrapper[4772]: W0320 10:55:17.968106 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:17 crc kubenswrapper[4772]: E0320 10:55:17.968239 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.985796 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:17 crc kubenswrapper[4772]: I0320 10:55:17.999901 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:18 crc kubenswrapper[4772]: W0320 10:55:18.104202 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.119:6443: connect: connection refused Mar 20 10:55:18 crc kubenswrapper[4772]: E0320 10:55:18.104331 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.119:6443: connect: connection refused" logger="UnhandledError" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.537769 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.727242 4772 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53" exitCode=0 Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.727322 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53"} Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.727458 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.729312 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.729374 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.729393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.734022 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"866ea67ab7ed52332d4e96e92873e663fdbd4eafa50ed26dd9a51897484399a5"} Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.734101 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.734218 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.734272 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.734226 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.734364 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.734397 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.735930 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.735987 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.736004 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.736027 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.736055 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc 
kubenswrapper[4772]: I0320 10:55:18.736101 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.736119 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.736065 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.736163 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.737303 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.737356 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:18 crc kubenswrapper[4772]: I0320 10:55:18.737377 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.745190 4772 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.745248 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.745291 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b"} Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.745365 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a"} Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.745384 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb"} Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.745384 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.745815 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.746320 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.746352 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.746366 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.746949 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.747009 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.747031 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.747579 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.747634 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.747665 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:19 crc kubenswrapper[4772]: I0320 10:55:19.771225 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.514910 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.756623 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.756924 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9"} Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.756978 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4"} Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.757024 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.757887 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.757921 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.757935 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.758084 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.758113 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.758126 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:20 crc kubenswrapper[4772]: I0320 10:55:20.762629 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Mar 20 10:55:21 crc kubenswrapper[4772]: I0320 10:55:21.052888 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:21 crc kubenswrapper[4772]: I0320 10:55:21.054340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc 
kubenswrapper[4772]: I0320 10:55:21.054445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4772]: I0320 10:55:21.054515 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:21 crc kubenswrapper[4772]: I0320 10:55:21.054586 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:21 crc kubenswrapper[4772]: I0320 10:55:21.538203 4772 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:55:21 crc kubenswrapper[4772]: I0320 10:55:21.538353 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:21 crc kubenswrapper[4772]: I0320 10:55:21.758207 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:21 crc kubenswrapper[4772]: I0320 10:55:21.758966 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:21 crc kubenswrapper[4772]: I0320 10:55:21.758994 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:21 crc kubenswrapper[4772]: I0320 10:55:21.759003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.383228 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.383449 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.384903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.384946 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.384962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.751957 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.760797 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.760797 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.761659 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 
10:55:22.761766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.761828 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.762223 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.762266 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.762282 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:22 crc kubenswrapper[4772]: I0320 10:55:22.863293 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-etcd/etcd-crc" Mar 20 10:55:23 crc kubenswrapper[4772]: I0320 10:55:23.765100 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:23 crc kubenswrapper[4772]: I0320 10:55:23.766198 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:23 crc kubenswrapper[4772]: I0320 10:55:23.766250 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:23 crc kubenswrapper[4772]: I0320 10:55:23.766271 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:24 crc kubenswrapper[4772]: E0320 10:55:24.751123 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:26 crc kubenswrapper[4772]: I0320 10:55:26.980307 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:26 crc kubenswrapper[4772]: I0320 10:55:26.980480 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:26 crc kubenswrapper[4772]: I0320 10:55:26.982016 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:26 crc kubenswrapper[4772]: I0320 10:55:26.982046 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:26 crc kubenswrapper[4772]: I0320 10:55:26.982056 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:27 crc kubenswrapper[4772]: I0320 10:55:27.250005 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:27 crc kubenswrapper[4772]: I0320 10:55:27.777718 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:27 crc kubenswrapper[4772]: I0320 10:55:27.780509 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:27 crc kubenswrapper[4772]: I0320 10:55:27.780594 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:27 crc kubenswrapper[4772]: I0320 10:55:27.780617 4772 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:28 crc kubenswrapper[4772]: W0320 10:55:28.380349 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Mar 20 10:55:28 crc kubenswrapper[4772]: I0320 10:55:28.380491 4772 trace.go:236] Trace[228747068]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (20-Mar-2026 10:55:18.378) (total time: 10001ms): Mar 20 10:55:28 crc kubenswrapper[4772]: Trace[228747068]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (10:55:28.380) Mar 20 10:55:28 crc kubenswrapper[4772]: Trace[228747068]: [10.001487186s] [10.001487186s] END Mar 20 10:55:28 crc kubenswrapper[4772]: E0320 10:55:28.380529 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Mar 20 10:55:28 crc kubenswrapper[4772]: I0320 10:55:28.569019 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Mar 20 10:55:29 crc kubenswrapper[4772]: W0320 10:55:29.501679 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z Mar 20 10:55:29 crc kubenswrapper[4772]: E0320 10:55:29.501801 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:29 crc kubenswrapper[4772]: E0320 10:55:29.502987 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8757d6b9133a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.56491193 +0000 UTC m=+0.655878415,LastTimestamp:2026-03-20 10:55:14.56491193 +0000 UTC m=+0.655878415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:29 crc kubenswrapper[4772]: W0320 10:55:29.503587 4772 reflector.go:561] 
k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z Mar 20 10:55:29 crc kubenswrapper[4772]: E0320 10:55:29.503720 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.505461 4772 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.505523 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 10:55:29 crc kubenswrapper[4772]: E0320 10:55:29.507748 4772 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:29 crc kubenswrapper[4772]: W0320 10:55:29.508297 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z Mar 20 10:55:29 crc kubenswrapper[4772]: E0320 10:55:29.508372 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:29 crc kubenswrapper[4772]: E0320 10:55:29.508787 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:55:29 crc kubenswrapper[4772]: E0320 10:55:29.510913 4772 controller.go:145] "Failed to ensure lease 
exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z" interval="6.4s" Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.512490 4772 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.512553 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.571199 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:29Z is after 2026-02-23T05:33:13Z Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.772101 4772 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver-check-endpoints namespace/openshift-kube-apiserver: Readiness probe status=failure output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" start-of-body= Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.772243 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" probeResult="failure" output="Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused" Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.785790 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.788688 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="866ea67ab7ed52332d4e96e92873e663fdbd4eafa50ed26dd9a51897484399a5" exitCode=255 Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.788756 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"866ea67ab7ed52332d4e96e92873e663fdbd4eafa50ed26dd9a51897484399a5"} Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.789060 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.790092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.790150 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.790169 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:29 crc kubenswrapper[4772]: I0320 10:55:29.791215 4772 scope.go:117] "RemoveContainer" containerID="866ea67ab7ed52332d4e96e92873e663fdbd4eafa50ed26dd9a51897484399a5" Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.583395 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:30Z is after 2026-02-23T05:33:13Z Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.792270 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.792515 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.793752 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.793814 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.793832 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.794386 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.796826 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4fe9c1ace7cec4dbcccf4569de79638e5a6da8681c0d1f9887caec765dc7767d"} Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.797022 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.797877 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.797907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.797918 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:30 crc kubenswrapper[4772]: I0320 10:55:30.809780 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.539111 4772 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.539256 4772 prober.go:107] "Probe failed" 
probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.572188 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:31Z is after 2026-02-23T05:33:13Z Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.801946 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.803114 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/0.log" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.804813 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4fe9c1ace7cec4dbcccf4569de79638e5a6da8681c0d1f9887caec765dc7767d" exitCode=255 Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.804922 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"4fe9c1ace7cec4dbcccf4569de79638e5a6da8681c0d1f9887caec765dc7767d"} Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.805013 4772 scope.go:117] "RemoveContainer" containerID="866ea67ab7ed52332d4e96e92873e663fdbd4eafa50ed26dd9a51897484399a5" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.805532 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.806948 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.808507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.808586 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.808611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.809502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.810106 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.810260 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:31 crc kubenswrapper[4772]: I0320 10:55:31.812229 4772 scope.go:117] "RemoveContainer" containerID="4fe9c1ace7cec4dbcccf4569de79638e5a6da8681c0d1f9887caec765dc7767d" Mar 20 10:55:31 crc 
kubenswrapper[4772]: E0320 10:55:31.812564 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:32 crc kubenswrapper[4772]: W0320 10:55:32.010291 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2026-02-23T05:33:13Z Mar 20 10:55:32 crc kubenswrapper[4772]: E0320 10:55:32.010386 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:32 crc kubenswrapper[4772]: I0320 10:55:32.389782 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:32 crc kubenswrapper[4772]: I0320 10:55:32.572535 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:32Z is after 2026-02-23T05:33:13Z Mar 20 10:55:32 crc kubenswrapper[4772]: I0320 10:55:32.758741 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:32 crc kubenswrapper[4772]: I0320 10:55:32.810635 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:55:32 crc kubenswrapper[4772]: I0320 10:55:32.813620 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:32 crc kubenswrapper[4772]: I0320 10:55:32.814989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:32 crc kubenswrapper[4772]: I0320 10:55:32.815040 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:32 crc kubenswrapper[4772]: I0320 10:55:32.815055 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:32 crc kubenswrapper[4772]: I0320 10:55:32.816054 4772 scope.go:117] "RemoveContainer" containerID="4fe9c1ace7cec4dbcccf4569de79638e5a6da8681c0d1f9887caec765dc7767d" Mar 20 10:55:32 crc kubenswrapper[4772]: E0320 10:55:32.816307 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:33 crc kubenswrapper[4772]: I0320 10:55:33.572157 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:33Z is after 2026-02-23T05:33:13Z Mar 20 10:55:33 crc kubenswrapper[4772]: I0320 10:55:33.817699 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:33 crc kubenswrapper[4772]: I0320 10:55:33.819298 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:33 crc kubenswrapper[4772]: I0320 10:55:33.819391 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:33 crc kubenswrapper[4772]: I0320 10:55:33.819413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:33 crc kubenswrapper[4772]: I0320 10:55:33.820653 4772 scope.go:117] "RemoveContainer" containerID="4fe9c1ace7cec4dbcccf4569de79638e5a6da8681c0d1f9887caec765dc7767d" Mar 20 10:55:33 crc kubenswrapper[4772]: E0320 10:55:33.821048 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:34 crc kubenswrapper[4772]: I0320 10:55:34.570779 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:34Z is after 2026-02-23T05:33:13Z Mar 20 10:55:34 crc kubenswrapper[4772]: I0320 10:55:34.749246 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:34 crc kubenswrapper[4772]: E0320 10:55:34.751345 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:34 crc kubenswrapper[4772]: I0320 10:55:34.820692 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:34 crc kubenswrapper[4772]: I0320 10:55:34.822466 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:34 crc kubenswrapper[4772]: I0320 10:55:34.822529 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:34 crc kubenswrapper[4772]: I0320 10:55:34.822550 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:34 crc kubenswrapper[4772]: I0320 10:55:34.823596 4772 scope.go:117] "RemoveContainer" containerID="4fe9c1ace7cec4dbcccf4569de79638e5a6da8681c0d1f9887caec765dc7767d" Mar 20 
10:55:34 crc kubenswrapper[4772]: E0320 10:55:34.823958 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:35 crc kubenswrapper[4772]: I0320 10:55:35.572068 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2026-02-23T05:33:13Z Mar 20 10:55:35 crc kubenswrapper[4772]: I0320 10:55:35.909983 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:35 crc kubenswrapper[4772]: I0320 10:55:35.912165 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:35 crc kubenswrapper[4772]: I0320 10:55:35.912227 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:35 crc kubenswrapper[4772]: I0320 10:55:35.912251 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:35 crc kubenswrapper[4772]: I0320 10:55:35.912305 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:35 crc kubenswrapper[4772]: E0320 10:55:35.918078 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:55:35 crc kubenswrapper[4772]: E0320 10:55:35.921778 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:35Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:55:36 crc kubenswrapper[4772]: I0320 10:55:36.571925 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2026-02-23T05:33:13Z Mar 20 10:55:36 crc kubenswrapper[4772]: W0320 10:55:36.731293 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2026-02-23T05:33:13Z Mar 20 10:55:36 crc kubenswrapper[4772]: E0320 10:55:36.731439 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get 
\"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:36Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:37 crc kubenswrapper[4772]: I0320 10:55:37.572990 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:37Z is after 2026-02-23T05:33:13Z Mar 20 10:55:38 crc kubenswrapper[4772]: I0320 10:55:38.201526 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:55:38 crc kubenswrapper[4772]: E0320 10:55:38.207885 4772 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:38 crc kubenswrapper[4772]: I0320 10:55:38.571601 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:38Z is after 2026-02-23T05:33:13Z Mar 20 10:55:38 crc kubenswrapper[4772]: W0320 10:55:38.928639 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:38Z is after 2026-02-23T05:33:13Z Mar 20 10:55:38 crc kubenswrapper[4772]: E0320 10:55:38.928763 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:38Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:39 crc kubenswrapper[4772]: E0320 10:55:39.509574 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:39Z is after 2026-02-23T05:33:13Z" event="&Event{ObjectMeta:{crc.189e8757d6b9133a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.56491193 +0000 UTC m=+0.655878415,LastTimestamp:2026-03-20 10:55:14.56491193 +0000 UTC m=+0.655878415,Count:1,Type:Normal,EventTime:0001-01-01 
00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:39 crc kubenswrapper[4772]: I0320 10:55:39.570342 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:39Z is after 2026-02-23T05:33:13Z Mar 20 10:55:39 crc kubenswrapper[4772]: W0320 10:55:39.698067 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:39Z is after 2026-02-23T05:33:13Z Mar 20 10:55:39 crc kubenswrapper[4772]: E0320 10:55:39.698177 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:39Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:39 crc kubenswrapper[4772]: I0320 10:55:39.771957 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:39 crc kubenswrapper[4772]: I0320 10:55:39.772272 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:39 crc kubenswrapper[4772]: I0320 10:55:39.773801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:39 crc kubenswrapper[4772]: I0320 10:55:39.773931 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:39 crc kubenswrapper[4772]: I0320 10:55:39.773961 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:39 crc kubenswrapper[4772]: I0320 10:55:39.774919 4772 scope.go:117] "RemoveContainer" containerID="4fe9c1ace7cec4dbcccf4569de79638e5a6da8681c0d1f9887caec765dc7767d" Mar 20 10:55:39 crc kubenswrapper[4772]: E0320 10:55:39.775226 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:40 crc kubenswrapper[4772]: I0320 10:55:40.572942 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:40Z is after 2026-02-23T05:33:13Z Mar 20 10:55:41 crc kubenswrapper[4772]: I0320 10:55:41.570290 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:41Z is after 2026-02-23T05:33:13Z Mar 20 10:55:41 crc kubenswrapper[4772]: W0320 10:55:41.601138 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:41Z is after 2026-02-23T05:33:13Z Mar 20 10:55:41 crc kubenswrapper[4772]: E0320 10:55:41.601253 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:41Z is after 2026-02-23T05:33:13Z" logger="UnhandledError" Mar 20 10:55:41 crc kubenswrapper[4772]: I0320 10:55:41.919593 4772 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:55:41 crc kubenswrapper[4772]: I0320 10:55:41.919654 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:41 crc kubenswrapper[4772]: I0320 10:55:41.919697 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:41 crc kubenswrapper[4772]: I0320 10:55:41.919828 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:41 crc kubenswrapper[4772]: I0320 10:55:41.921370 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:41 crc kubenswrapper[4772]: I0320 10:55:41.921396 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:41 crc kubenswrapper[4772]: I0320 10:55:41.921404 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:41 crc kubenswrapper[4772]: I0320 10:55:41.921829 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="cluster-policy-controller" containerStatusID={"Type":"cri-o","ID":"38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container cluster-policy-controller failed startup probe, will be restarted" Mar 20 10:55:41 crc kubenswrapper[4772]: I0320 10:55:41.921981 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" 
podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" containerID="cri-o://38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957" gracePeriod=30 Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.571684 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:42Z is after 2026-02-23T05:33:13Z Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.849159 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.849794 4772 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957" exitCode=255 Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.849898 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957"} Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.849957 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ad76f823c12a841777c33e9050b031bfdc49600d43524c67d2b54b39a1ae8825"} Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.850105 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.851346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.851415 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.851435 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.918465 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.920393 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.920474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.920502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:42 crc kubenswrapper[4772]: I0320 10:55:42.920555 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:42 crc kubenswrapper[4772]: E0320 10:55:42.926772 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T10:55:42Z is after 2026-02-23T05:33:13Z" node="crc" Mar 20 10:55:42 crc kubenswrapper[4772]: E0320 10:55:42.930634 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:42Z is after 2026-02-23T05:33:13Z" interval="7s" Mar 20 10:55:43 crc kubenswrapper[4772]: I0320 10:55:43.571739 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:43Z is after 2026-02-23T05:33:13Z Mar 20 10:55:44 crc kubenswrapper[4772]: I0320 10:55:44.571949 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:44Z is after 2026-02-23T05:33:13Z Mar 20 10:55:44 crc kubenswrapper[4772]: E0320 10:55:44.751516 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:45 crc kubenswrapper[4772]: I0320 10:55:45.571556 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:45Z is after 2026-02-23T05:33:13Z Mar 20 10:55:46 crc kubenswrapper[4772]: I0320 10:55:46.574403 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:47 crc kubenswrapper[4772]: I0320 10:55:47.249907 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:47 crc kubenswrapper[4772]: I0320 10:55:47.250079 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:47 crc kubenswrapper[4772]: I0320 10:55:47.251071 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:47 crc kubenswrapper[4772]: I0320 10:55:47.251109 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:47 crc kubenswrapper[4772]: I0320 10:55:47.251120 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:47 crc kubenswrapper[4772]: I0320 10:55:47.573472 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:48 crc kubenswrapper[4772]: I0320 10:55:48.538439 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:48 crc kubenswrapper[4772]: I0320 10:55:48.538702 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:48 crc kubenswrapper[4772]: I0320 10:55:48.540329 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:48 crc kubenswrapper[4772]: I0320 10:55:48.540382 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:48 crc kubenswrapper[4772]: I0320 10:55:48.540405 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:48 crc kubenswrapper[4772]: I0320 10:55:48.568459 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.519472 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757d6b9133a default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.56491193 +0000 UTC m=+0.655878415,LastTimestamp:2026-03-20 10:55:14.56491193 +0000 UTC m=+0.655878415,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.527192 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac2998d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632645005 +0000 UTC m=+0.723611510,LastTimestamp:2026-03-20 10:55:14.632645005 +0000 UTC m=+0.723611510,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.534028 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac321f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632679926 +0000 UTC m=+0.723646431,LastTimestamp:2026-03-20 10:55:14.632679926 +0000 UTC 
m=+0.723646431,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.540614 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac36419 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632696857 +0000 UTC m=+0.723663352,LastTimestamp:2026-03-20 10:55:14.632696857 +0000 UTC m=+0.723663352,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.546956 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757e12e9469 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeAllocatableEnforced,Message:Updated Node Allocatable limit across pods,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.740384873 +0000 UTC m=+0.831351368,LastTimestamp:2026-03-20 10:55:14.740384873 +0000 UTC m=+0.831351368,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.554892 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac2998d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac2998d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632645005 +0000 UTC m=+0.723611510,LastTimestamp:2026-03-20 10:55:14.743133166 +0000 UTC m=+0.834099651,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.562148 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac321f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac321f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632679926 +0000 UTC 
m=+0.723646431,LastTimestamp:2026-03-20 10:55:14.743158657 +0000 UTC m=+0.834125142,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.573637 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac36419\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac36419 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632696857 +0000 UTC m=+0.723663352,LastTimestamp:2026-03-20 10:55:14.743167767 +0000 UTC m=+0.834134252,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: I0320 10:55:49.573725 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.580569 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac2998d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac2998d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632645005 +0000 UTC m=+0.723611510,LastTimestamp:2026-03-20 10:55:14.745238053 +0000 UTC m=+0.836204538,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.587701 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac321f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac321f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632679926 +0000 UTC m=+0.723646431,LastTimestamp:2026-03-20 10:55:14.745250403 +0000 UTC m=+0.836216888,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.594588 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac36419\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace 
\"default\"" event="&Event{ObjectMeta:{crc.189e8757dac36419 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632696857 +0000 UTC m=+0.723663352,LastTimestamp:2026-03-20 10:55:14.745261254 +0000 UTC m=+0.836227739,Count:3,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.600726 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac2998d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac2998d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632645005 +0000 UTC m=+0.723611510,LastTimestamp:2026-03-20 10:55:14.746727113 +0000 UTC m=+0.837693608,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.607922 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac321f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac321f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632679926 +0000 UTC m=+0.723646431,LastTimestamp:2026-03-20 10:55:14.746754134 +0000 UTC m=+0.837720629,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.617238 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac36419\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac36419 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632696857 +0000 UTC m=+0.723663352,LastTimestamp:2026-03-20 10:55:14.746764914 +0000 UTC m=+0.837731409,Count:4,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.626291 4772 event.go:359] "Server rejected event (will not retry!)" err="events 
\"crc.189e8757dac2998d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac2998d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632645005 +0000 UTC m=+0.723611510,LastTimestamp:2026-03-20 10:55:14.747949026 +0000 UTC m=+0.838915511,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.633317 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac321f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac321f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632679926 +0000 UTC m=+0.723646431,LastTimestamp:2026-03-20 10:55:14.747973126 +0000 UTC m=+0.838939611,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.639913 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac36419\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac36419 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632696857 +0000 UTC m=+0.723663352,LastTimestamp:2026-03-20 10:55:14.747987007 +0000 UTC m=+0.838953492,Count:5,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.643621 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac2998d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac2998d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632645005 +0000 UTC m=+0.723611510,LastTimestamp:2026-03-20 10:55:14.74809731 +0000 UTC m=+0.839063795,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc 
kubenswrapper[4772]: E0320 10:55:49.647198 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac321f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac321f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632679926 +0000 UTC m=+0.723646431,LastTimestamp:2026-03-20 10:55:14.74811453 +0000 UTC m=+0.839081025,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.650244 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac36419\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac36419 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632696857 +0000 UTC m=+0.723663352,LastTimestamp:2026-03-20 10:55:14.748132271 +0000 UTC m=+0.839098756,Count:6,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.654368 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac2998d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac2998d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632645005 +0000 UTC m=+0.723611510,LastTimestamp:2026-03-20 10:55:14.796175068 +0000 UTC m=+0.887141633,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.660256 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac321f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac321f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632679926 +0000 UTC m=+0.723646431,LastTimestamp:2026-03-20 10:55:14.796209229 +0000 UTC m=+0.887175764,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 
+0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.664740 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac36419\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac36419 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientPID,Message:Node crc status is now: NodeHasSufficientPID,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632696857 +0000 UTC m=+0.723663352,LastTimestamp:2026-03-20 10:55:14.79623235 +0000 UTC m=+0.887198875,Count:7,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.669508 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac2998d\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac2998d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node crc status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632645005 +0000 UTC m=+0.723611510,LastTimestamp:2026-03-20 10:55:14.7962652 +0000 UTC m=+0.887231685,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.675603 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"crc.189e8757dac321f6\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{crc.189e8757dac321f6 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasNoDiskPressure,Message:Node crc status is now: NodeHasNoDiskPressure,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:14.632679926 +0000 UTC m=+0.723646431,LastTimestamp:2026-03-20 10:55:14.796308302 +0000 UTC m=+0.887274797,Count:8,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.681772 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8757fb2d9f4f openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image 
\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.176529743 +0000 UTC m=+1.267496228,LastTimestamp:2026-03-20 10:55:15.176529743 +0000 UTC m=+1.267496228,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.688071 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8757fbd23342 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.187315522 +0000 UTC m=+1.278282027,LastTimestamp:2026-03-20 10:55:15.187315522 +0000 UTC m=+1.278282027,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.694375 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8757fd41329b openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.211367067 +0000 UTC m=+1.302333592,LastTimestamp:2026-03-20 10:55:15.211367067 +0000 UTC m=+1.302333592,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.701011 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8757fd90ef2e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on 
machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.216592686 +0000 UTC m=+1.307559181,LastTimestamp:2026-03-20 10:55:15.216592686 +0000 UTC m=+1.307559181,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.707307 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8757fdd024de openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.220735198 +0000 UTC m=+1.311701693,LastTimestamp:2026-03-20 10:55:15.220735198 +0000 UTC m=+1.311701693,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.714269 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87582239577b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.831609211 +0000 UTC m=+1.922575716,LastTimestamp:2026-03-20 10:55:15.831609211 +0000 UTC m=+1.922575716,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.720742 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e8758223b3fb3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Created,Message:Created container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.831734195 +0000 UTC m=+1.922700720,LastTimestamp:2026-03-20 10:55:15.831734195 +0000 UTC m=+1.922700720,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.726631 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875822430b24 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.832245028 +0000 UTC m=+1.923211543,LastTimestamp:2026-03-20 10:55:15.832245028 +0000 UTC m=+1.923211543,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.732186 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e87582243bba1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Created,Message:Created container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.832290209 +0000 UTC m=+1.923256734,LastTimestamp:2026-03-20 10:55:15.832290209 +0000 UTC m=+1.923256734,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.737142 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e875822469a2e openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Created,Message:Created container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.832478254 +0000 UTC m=+1.923444779,LastTimestamp:2026-03-20 10:55:15.832478254 +0000 UTC m=+1.923444779,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.741782 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8758239c7647 openshift-kube-scheduler 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{wait-for-host-port},},Reason:Started,Message:Started container wait-for-host-port,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.854882375 +0000 UTC m=+1.945848900,LastTimestamp:2026-03-20 10:55:15.854882375 +0000 UTC m=+1.945848900,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.746178 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875823a76133 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager},},Reason:Started,Message:Started container kube-controller-manager,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.855597875 +0000 UTC m=+1.946564390,LastTimestamp:2026-03-20 10:55:15.855597875 +0000 UTC m=+1.946564390,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.750757 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875823b5e366 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.85654871 +0000 UTC m=+1.947515235,LastTimestamp:2026-03-20 10:55:15.85654871 +0000 UTC m=+1.947515235,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.757231 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875823b8c955 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.856738645 +0000 UTC m=+1.947705170,LastTimestamp:2026-03-20 10:55:15.856738645 +0000 UTC 
m=+1.947705170,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.762768 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e875823c3e366 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{setup},},Reason:Started,Message:Started container setup,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.857466214 +0000 UTC m=+1.948432729,LastTimestamp:2026-03-20 10:55:15.857466214 +0000 UTC m=+1.948432729,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.767507 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875823c8acf0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.857779952 +0000 UTC m=+1.948746477,LastTimestamp:2026-03-20 10:55:15.857779952 +0000 UTC m=+1.948746477,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.773666 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87583ade4a28 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.245072424 +0000 UTC m=+2.336038949,LastTimestamp:2026-03-20 10:55:16.245072424 +0000 UTC m=+2.336038949,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.780112 4772 
event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87583bc384aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.260095146 +0000 UTC m=+2.351061661,LastTimestamp:2026-03-20 10:55:16.260095146 +0000 UTC m=+2.351061661,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.785265 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87583be10c3e openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.262030398 +0000 UTC m=+2.352996913,LastTimestamp:2026-03-20 10:55:16.262030398 +0000 UTC m=+2.352996913,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.791416 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87584a30ef6b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Created,Message:Created container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.502146923 +0000 UTC m=+2.593113418,LastTimestamp:2026-03-20 10:55:16.502146923 +0000 UTC m=+2.593113418,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.796517 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource 
\"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87584b0859aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-cert-syncer},},Reason:Started,Message:Started container kube-controller-manager-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.516264362 +0000 UTC m=+2.607230857,LastTimestamp:2026-03-20 10:55:16.516264362 +0000 UTC m=+2.607230857,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.801597 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87584b1bf0d7 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.517548247 +0000 UTC m=+2.608514732,LastTimestamp:2026-03-20 10:55:16.517548247 +0000 UTC m=+2.608514732,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.807327 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8758548e06e8 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.676019944 +0000 UTC m=+2.766986449,LastTimestamp:2026-03-20 10:55:16.676019944 +0000 UTC m=+2.766986449,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.812369 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the 
namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8758549667b1 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.676569009 +0000 UTC m=+2.767535504,LastTimestamp:2026-03-20 10:55:16.676569009 +0000 UTC m=+2.767535504,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.818559 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758554e93e8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.688638952 +0000 UTC m=+2.779605457,LastTimestamp:2026-03-20 10:55:16.688638952 +0000 UTC m=+2.779605457,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.824662 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e875855ddb1e1 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.698018273 +0000 UTC m=+2.788984768,LastTimestamp:2026-03-20 10:55:16.698018273 +0000 UTC m=+2.788984768,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.829964 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" 
event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875858f7baf3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Created,Message:Created container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.750056179 +0000 UTC m=+2.841022684,LastTimestamp:2026-03-20 10:55:16.750056179 +0000 UTC m=+2.841022684,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.835099 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875859db70bc openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-controller-manager-recovery-controller},},Reason:Started,Message:Started container kube-controller-manager-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.764979388 +0000 UTC m=+2.855945873,LastTimestamp:2026-03-20 10:55:16.764979388 +0000 UTC m=+2.855945873,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.839713 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8758622c4e08 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Created,Message:Created container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.904496648 +0000 UTC m=+2.995463123,LastTimestamp:2026-03-20 10:55:16.904496648 +0000 UTC m=+2.995463123,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.844413 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8758626d245e openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Created,Message:Created container kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.908745822 +0000 UTC m=+2.999712307,LastTimestamp:2026-03-20 10:55:16.908745822 +0000 UTC m=+2.999712307,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.848891 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e8758631fde22 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler},},Reason:Started,Message:Started container kube-scheduler,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.920458786 +0000 UTC m=+3.011425271,LastTimestamp:2026-03-20 10:55:16.920458786 +0000 UTC m=+3.011425271,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.855415 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e87586330ae59 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.921560665 +0000 UTC m=+3.012527150,LastTimestamp:2026-03-20 10:55:16.921560665 +0000 UTC m=+3.012527150,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.861931 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87586342be89 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Started,Message:Started container 
kube-apiserver,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.922744457 +0000 UTC m=+3.013710942,LastTimestamp:2026-03-20 10:55:16.922744457 +0000 UTC m=+3.013710942,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.866318 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e8758634fa84a openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.92359073 +0000 UTC m=+3.014557215,LastTimestamp:2026-03-20 10:55:16.92359073 +0000 UTC m=+3.014557215,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.870679 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e8758640045c4 openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Created,Message:Created container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.93516538 +0000 UTC m=+3.026131865,LastTimestamp:2026-03-20 10:55:16.93516538 +0000 UTC m=+3.026131865,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.874398 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875864465873 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Created,Message:Created container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.939757683 +0000 UTC m=+3.030724168,LastTimestamp:2026-03-20 10:55:16.939757683 +0000 UTC m=+3.030724168,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.878770 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-machine-config-operator\"" event="&Event{ObjectMeta:{kube-rbac-proxy-crio-crc.189e875865bcd33c openshift-machine-config-operator 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-machine-config-operator,Name:kube-rbac-proxy-crio-crc,UID:d1b160f5dda77d281dd8e69ec8d817f9,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-rbac-proxy-crio},},Reason:Started,Message:Started container kube-rbac-proxy-crio,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.96429958 +0000 UTC m=+3.055266065,LastTimestamp:2026-03-20 10:55:16.96429958 +0000 UTC m=+3.055266065,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.883833 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875865eee5bf openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-ensure-env-vars},},Reason:Started,Message:Started container etcd-ensure-env-vars,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.967581119 +0000 UTC m=+3.058547604,LastTimestamp:2026-03-20 10:55:16.967581119 +0000 UTC m=+3.058547604,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.887704 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e87586ed562af openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Created,Message:Created container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.116904111 +0000 UTC m=+3.207870586,LastTimestamp:2026-03-20 10:55:17.116904111 +0000 UTC m=+3.207870586,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.892462 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87586f74e812 openshift-kube-apiserver 0 
0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Created,Message:Created container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.127358482 +0000 UTC m=+3.218324987,LastTimestamp:2026-03-20 10:55:17.127358482 +0000 UTC m=+3.218324987,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.898489 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e87586fc10ef7 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-cert-syncer},},Reason:Started,Message:Started container kube-scheduler-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.132349175 +0000 UTC m=+3.223315680,LastTimestamp:2026-03-20 10:55:17.132349175 +0000 UTC m=+3.223315680,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.903686 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e87586fdf2980 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.134322048 +0000 UTC m=+3.225288533,LastTimestamp:2026-03-20 10:55:17.134322048 +0000 UTC m=+3.225288533,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.908048 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875870696eef openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-syncer},},Reason:Started,Message:Started container kube-apiserver-cert-syncer,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.143383791 +0000 UTC m=+3.234350276,LastTimestamp:2026-03-20 10:55:17.143383791 +0000 UTC m=+3.234350276,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.912171 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875870810ace openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.144931022 +0000 UTC m=+3.235897507,LastTimestamp:2026-03-20 10:55:17.144931022 +0000 UTC m=+3.235897507,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.917037 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e87587d57e3b6 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Created,Message:Created container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.360337846 +0000 UTC m=+3.451304331,LastTimestamp:2026-03-20 10:55:17.360337846 +0000 UTC m=+3.451304331,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.921637 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87587d8aaf93 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Created,Message:Created container kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.363666835 +0000 UTC m=+3.454633360,LastTimestamp:2026-03-20 10:55:17.363666835 +0000 UTC m=+3.454633360,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.926176 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-scheduler\"" event="&Event{ObjectMeta:{openshift-kube-scheduler-crc.189e87587e8404f1 openshift-kube-scheduler 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-scheduler,Name:openshift-kube-scheduler-crc,UID:3dcd261975c3d6b9a6ad6367fd4facd3,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-scheduler-recovery-controller},},Reason:Started,Message:Started container kube-scheduler-recovery-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.380007153 +0000 UTC m=+3.470973658,LastTimestamp:2026-03-20 10:55:17.380007153 +0000 UTC m=+3.470973658,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: I0320 10:55:49.927632 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:49 crc kubenswrapper[4772]: I0320 10:55:49.928792 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:49 crc kubenswrapper[4772]: I0320 10:55:49.928826 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:49 crc kubenswrapper[4772]: I0320 10:55:49.928835 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:49 crc kubenswrapper[4772]: I0320 10:55:49.928871 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.929955 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.930497 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87587ead8acc openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-cert-regeneration-controller},},Reason:Started,Message:Started container 
kube-apiserver-cert-regeneration-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.382728396 +0000 UTC m=+3.473694891,LastTimestamp:2026-03-20 10:55:17.382728396 +0000 UTC m=+3.473694891,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.934320 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87587ebf0fcd openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.383876557 +0000 UTC m=+3.474843042,LastTimestamp:2026-03-20 10:55:17.383876557 +0000 UTC m=+3.474843042,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.934443 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.936123 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87588c15f06d openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Created,Message:Created container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.607673965 +0000 UTC m=+3.698640460,LastTimestamp:2026-03-20 10:55:17.607673965 +0000 UTC m=+3.698640460,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.939295 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87588d1dd6f4 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-insecure-readyz},},Reason:Started,Message:Started container kube-apiserver-insecure-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.624968948 +0000 UTC m=+3.715935423,LastTimestamp:2026-03-20 10:55:17.624968948 +0000 UTC m=+3.715935423,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.943925 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87588d2fef31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.626154801 +0000 UTC m=+3.717121286,LastTimestamp:2026-03-20 10:55:17.626154801 +0000 UTC m=+3.717121286,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.950124 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e875892f59a38 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.722995256 +0000 UTC m=+3.813961731,LastTimestamp:2026-03-20 10:55:17.722995256 +0000 UTC m=+3.813961731,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.951783 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87589b7140f9 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Created,Message:Created container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.865316601 +0000 UTC m=+3.956283086,LastTimestamp:2026-03-20 10:55:17.865316601 +0000 UTC m=+3.956283086,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.956499 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87589c0edf96 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Started,Message:Started container kube-apiserver-check-endpoints,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.875646358 +0000 UTC m=+3.966612833,LastTimestamp:2026-03-20 10:55:17.875646358 +0000 UTC m=+3.966612833,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.961253 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87589efc3880 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Created,Message:Created container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.924755584 +0000 UTC m=+4.015722069,LastTimestamp:2026-03-20 10:55:17.924755584 +0000 UTC m=+4.015722069,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.966055 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758a0951aac openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.initContainers{etcd-resources-copy},},Reason:Started,Message:Started container etcd-resources-copy,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.951552172 +0000 UTC m=+4.042518657,LastTimestamp:2026-03-20 10:55:17.951552172 +0000 UTC m=+4.042518657,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.971115 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758cf1b945b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:18.732117083 +0000 UTC m=+4.823083598,LastTimestamp:2026-03-20 10:55:18.732117083 +0000 UTC m=+4.823083598,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.977741 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758dd88bcd2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Created,Message:Created container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:18.97415189 +0000 UTC m=+5.065118385,LastTimestamp:2026-03-20 10:55:18.97415189 +0000 UTC m=+5.065118385,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.982414 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758de491fed openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcdctl},},Reason:Started,Message:Started container etcdctl,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:18.986760173 +0000 UTC m=+5.077726688,LastTimestamp:2026-03-20 10:55:18.986760173 +0000 UTC m=+5.077726688,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.987669 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758de600ef2 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:18.988263154 +0000 UTC m=+5.079229679,LastTimestamp:2026-03-20 10:55:18.988263154 +0000 UTC m=+5.079229679,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.993154 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758ef8706ce openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Created,Message:Created container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:19.276029646 +0000 UTC m=+5.366996171,LastTimestamp:2026-03-20 10:55:19.276029646 +0000 UTC m=+5.366996171,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:49 crc kubenswrapper[4772]: E0320 10:55:49.998207 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758f0860f24 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd},},Reason:Started,Message:Started container etcd,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:19.29274346 +0000 UTC m=+5.383709985,LastTimestamp:2026-03-20 10:55:19.29274346 +0000 UTC m=+5.383709985,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.002965 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758f0a5b410 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:19.294817296 +0000 UTC m=+5.385783811,LastTimestamp:2026-03-20 10:55:19.294817296 +0000 UTC 
m=+5.385783811,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.008780 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758fe96ab52 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Created,Message:Created container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:19.528713042 +0000 UTC m=+5.619679527,LastTimestamp:2026-03-20 10:55:19.528713042 +0000 UTC m=+5.619679527,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.014332 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758ff9fa8de openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-metrics},},Reason:Started,Message:Started container etcd-metrics,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:19.546079454 +0000 UTC m=+5.637045939,LastTimestamp:2026-03-20 10:55:19.546079454 +0000 UTC m=+5.637045939,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.019038 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e8758ffafc655 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:19.547135573 +0000 UTC m=+5.638102058,LastTimestamp:2026-03-20 10:55:19.547135573 +0000 UTC m=+5.638102058,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.024393 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87590be7381b openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Created,Message:Created container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:19.752095771 +0000 UTC m=+5.843062296,LastTimestamp:2026-03-20 10:55:19.752095771 +0000 UTC m=+5.843062296,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.029794 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87590ccbbf75 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-readyz},},Reason:Started,Message:Started container etcd-readyz,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:19.767072629 +0000 UTC m=+5.858039114,LastTimestamp:2026-03-20 10:55:19.767072629 +0000 UTC m=+5.858039114,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.033887 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87590cecfd92 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:19.769251218 +0000 UTC m=+5.860217703,LastTimestamp:2026-03-20 10:55:19.769251218 +0000 UTC m=+5.860217703,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.038978 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87591b4c58a8 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Created,Message:Created container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:20.01038148 +0000 UTC m=+6.101347965,LastTimestamp:2026-03-20 10:55:20.01038148 +0000 UTC m=+6.101347965,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.043237 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-etcd\"" event="&Event{ObjectMeta:{etcd-crc.189e87591c5da5e1 openshift-etcd 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-etcd,Name:etcd-crc,UID:2139d3e2895fc6797b9c76a1b4c9886d,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{etcd-rev},},Reason:Started,Message:Started container etcd-rev,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:20.028292577 +0000 UTC m=+6.119259092,LastTimestamp:2026-03-20 10:55:20.028292577 +0000 UTC m=+6.119259092,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.049419 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:50 crc kubenswrapper[4772]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8759765ec7f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:50 crc kubenswrapper[4772]: body: Mar 20 10:55:50 crc kubenswrapper[4772]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:21.538316272 +0000 UTC m=+7.629282787,LastTimestamp:2026-03-20 10:55:21.538316272 +0000 UTC m=+7.629282787,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:50 crc kubenswrapper[4772]: > Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.053301 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87597660284b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:21.538406475 +0000 UTC m=+7.629373000,LastTimestamp:2026-03-20 10:55:21.538406475 +0000 UTC m=+7.629373000,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.055974 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:55:50 crc kubenswrapper[4772]: &Event{ObjectMeta:{kube-apiserver-crc.189e875b51406867 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 10:55:50 crc kubenswrapper[4772]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:55:50 crc kubenswrapper[4772]: Mar 20 10:55:50 crc kubenswrapper[4772]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:29.505503335 +0000 UTC m=+15.596469820,LastTimestamp:2026-03-20 10:55:29.505503335 +0000 UTC m=+15.596469820,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:50 crc kubenswrapper[4772]: > Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.063125 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875b5141190c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:29.505548556 +0000 UTC m=+15.596515041,LastTimestamp:2026-03-20 10:55:29.505548556 +0000 UTC m=+15.596515041,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.069688 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e875b51406867\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:55:50 crc kubenswrapper[4772]: &Event{ObjectMeta:{kube-apiserver-crc.189e875b51406867 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:ProbeError,Message:Startup probe error: HTTP probe failed with statuscode: 403 Mar 20 10:55:50 crc kubenswrapper[4772]: body: {"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get 
path \"/livez\"","reason":"Forbidden","details":{},"code":403} Mar 20 10:55:50 crc kubenswrapper[4772]: Mar 20 10:55:50 crc kubenswrapper[4772]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:29.505503335 +0000 UTC m=+15.596469820,LastTimestamp:2026-03-20 10:55:29.512534777 +0000 UTC m=+15.603501272,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:50 crc kubenswrapper[4772]: > Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.075991 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e875b5141190c\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875b5141190c openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Startup probe failed: HTTP probe failed with statuscode: 403,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:29.505548556 +0000 UTC m=+15.596515041,LastTimestamp:2026-03-20 10:55:29.512583838 +0000 UTC m=+15.603550333,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.080925 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event=< Mar 20 10:55:50 crc kubenswrapper[4772]: &Event{ObjectMeta:{kube-apiserver-crc.189e875b612605ff openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:ProbeError,Message:Readiness probe error: Get "https://192.168.126.11:17697/healthz": dial tcp 192.168.126.11:17697: connect: connection refused Mar 20 10:55:50 crc kubenswrapper[4772]: body: Mar 20 10:55:50 crc kubenswrapper[4772]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:29.772209663 +0000 UTC m=+15.863176178,LastTimestamp:2026-03-20 10:55:29.772209663 +0000 UTC m=+15.863176178,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:50 crc kubenswrapper[4772]: > Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.084683 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e875b61272563 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Unhealthy,Message:Readiness probe failed: Get \"https://192.168.126.11:17697/healthz\": dial tcp 192.168.126.11:17697: connect: connection refused,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:29.772283235 +0000 UTC m=+15.863249760,LastTimestamp:2026-03-20 10:55:29.772283235 +0000 UTC m=+15.863249760,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.093569 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-apiserver-crc.189e87588d2fef31\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-apiserver\"" event="&Event{ObjectMeta:{kube-apiserver-crc.189e87588d2fef31 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-crc,UID:f4b27818a5e8e43d0dc095d08835c792,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver-check-endpoints},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:17.626154801 +0000 UTC m=+3.717121286,LastTimestamp:2026-03-20 10:55:29.792908947 +0000 UTC m=+15.883875442,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.101186 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:50 crc kubenswrapper[4772]: &Event{ObjectMeta:{kube-controller-manager-crc.189e875bca7879b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:50 crc kubenswrapper[4772]: body: Mar 20 10:55:50 crc kubenswrapper[4772]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:31.539220912 +0000 UTC m=+17.630187397,LastTimestamp:2026-03-20 10:55:31.539220912 +0000 UTC m=+17.630187397,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:50 crc kubenswrapper[4772]: > Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.108334 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in 
the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875bca7986ea openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:31.539289834 +0000 UTC m=+17.630256319,LastTimestamp:2026-03-20 10:55:31.539289834 +0000 UTC m=+17.630256319,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.113229 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e8759765ec7f0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:50 crc kubenswrapper[4772]: &Event{ObjectMeta:{kube-controller-manager-crc.189e8759765ec7f0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": context deadline exceeded (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:50 crc kubenswrapper[4772]: body: Mar 20 10:55:50 crc kubenswrapper[4772]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:21.538316272 +0000 UTC m=+7.629282787,LastTimestamp:2026-03-20 10:55:41.919639379 +0000 UTC m=+28.010605864,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:50 crc kubenswrapper[4772]: > Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.117073 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87597660284b\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87597660284b openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:21.538406475 +0000 UTC m=+7.629373000,LastTimestamp:2026-03-20 10:55:41.91967488 +0000 UTC m=+28.010641365,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.123071 4772 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875e35549ce3 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Killing,Message:Container cluster-policy-controller failed startup probe, will be restarted,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:41.921967331 +0000 UTC m=+28.012933816,LastTimestamp:2026-03-20 10:55:41.921967331 +0000 UTC m=+28.012933816,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.127755 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e875823c8acf0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875823c8acf0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Pulled,Message:Container image \"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:15.857779952 +0000 UTC m=+1.948746477,LastTimestamp:2026-03-20 10:55:42.044896307 +0000 UTC m=+28.135862802,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.133527 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87583ade4a28\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87583ade4a28 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Created,Message:Created container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.245072424 +0000 UTC m=+2.336038949,LastTimestamp:2026-03-20 10:55:42.282273207 +0000 UTC m=+28.373239712,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: E0320 10:55:50.138814 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e87583bc384aa\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e87583bc384aa openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Started,Message:Started container cluster-policy-controller,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:16.260095146 +0000 UTC m=+2.351061661,LastTimestamp:2026-03-20 10:55:42.294550411 +0000 UTC m=+28.385516906,Count:2,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.575598 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.641709 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.643346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.643572 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.643708 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.644688 4772 scope.go:117] "RemoveContainer" containerID="4fe9c1ace7cec4dbcccf4569de79638e5a6da8681c0d1f9887caec765dc7767d" Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.881696 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.885149 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7752e1edc41f1d08085a02ca94e8ff6a4ee7ca0b6050603f5689c343614dd59a"} Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.885383 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.886471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.886512 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:50 crc kubenswrapper[4772]: I0320 10:55:50.886527 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:51 crc kubenswrapper[4772]: I0320 10:55:51.539531 4772 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:55:51 crc kubenswrapper[4772]: I0320 10:55:51.539694 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:55:51 crc kubenswrapper[4772]: E0320 10:55:51.544907 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e875bca7879b0\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event=< Mar 20 10:55:51 crc kubenswrapper[4772]: &Event{ObjectMeta:{kube-controller-manager-crc.189e875bca7879b0 openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:ProbeError,Message:Startup probe error: Get "https://192.168.126.11:10357/healthz": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers) Mar 20 10:55:51 crc kubenswrapper[4772]: body: Mar 20 10:55:51 crc kubenswrapper[4772]: ,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:31.539220912 +0000 UTC m=+17.630187397,LastTimestamp:2026-03-20 10:55:51.539644382 +0000 UTC m=+37.630610877,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,} Mar 20 10:55:51 crc kubenswrapper[4772]: > Mar 20 10:55:51 crc kubenswrapper[4772]: E0320 10:55:51.554963 4772 event.go:359] "Server rejected event (will not retry!)" err="events \"kube-controller-manager-crc.189e875bca7986ea\" is forbidden: User \"system:anonymous\" cannot patch resource \"events\" in API group \"\" in the namespace \"openshift-kube-controller-manager\"" event="&Event{ObjectMeta:{kube-controller-manager-crc.189e875bca7986ea openshift-kube-controller-manager 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-controller-manager,Name:kube-controller-manager-crc,UID:f614b9022728cf315e60c057852e563e,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{cluster-policy-controller},},Reason:Unhealthy,Message:Startup probe failed: Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers),Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 10:55:31.539289834 +0000 UTC m=+17.630256319,LastTimestamp:2026-03-20 10:55:51.539730855 +0000 UTC m=+37.630697350,Count:2,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 10:55:51 crc kubenswrapper[4772]: I0320 10:55:51.572691 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:52 crc kubenswrapper[4772]: I0320 10:55:52.570868 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:52 crc kubenswrapper[4772]: I0320 10:55:52.891386 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:55:52 crc kubenswrapper[4772]: I0320 10:55:52.892143 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/1.log" Mar 20 10:55:52 crc kubenswrapper[4772]: I0320 10:55:52.894295 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="7752e1edc41f1d08085a02ca94e8ff6a4ee7ca0b6050603f5689c343614dd59a" exitCode=255 Mar 20 10:55:52 crc kubenswrapper[4772]: I0320 10:55:52.894345 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"7752e1edc41f1d08085a02ca94e8ff6a4ee7ca0b6050603f5689c343614dd59a"} Mar 20 10:55:52 crc kubenswrapper[4772]: I0320 10:55:52.894386 4772 scope.go:117] "RemoveContainer" containerID="4fe9c1ace7cec4dbcccf4569de79638e5a6da8681c0d1f9887caec765dc7767d" Mar 20 10:55:52 crc kubenswrapper[4772]: I0320 10:55:52.894567 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:52 crc kubenswrapper[4772]: I0320 10:55:52.895885 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:52 crc kubenswrapper[4772]: I0320 10:55:52.895935 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:52 crc kubenswrapper[4772]: I0320 10:55:52.895981 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:52 crc kubenswrapper[4772]: I0320 10:55:52.896740 4772 scope.go:117] "RemoveContainer" containerID="7752e1edc41f1d08085a02ca94e8ff6a4ee7ca0b6050603f5689c343614dd59a" Mar 20 10:55:52 crc kubenswrapper[4772]: E0320 10:55:52.896978 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:53 crc kubenswrapper[4772]: I0320 10:55:53.574365 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope 
Mar 20 10:55:53 crc kubenswrapper[4772]: W0320 10:55:53.633890 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User "system:anonymous" cannot list resource "runtimeclasses" in API group "node.k8s.io" at the cluster scope Mar 20 10:55:53 crc kubenswrapper[4772]: E0320 10:55:53.633966 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: runtimeclasses.node.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"runtimeclasses\" in API group \"node.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 10:55:53 crc kubenswrapper[4772]: I0320 10:55:53.899629 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:55:54 crc kubenswrapper[4772]: I0320 10:55:54.404876 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Mar 20 10:55:54 crc kubenswrapper[4772]: I0320 10:55:54.422937 4772 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 10:55:54 crc kubenswrapper[4772]: I0320 10:55:54.573471 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:54 crc kubenswrapper[4772]: W0320 10:55:54.606362 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User "system:anonymous" cannot list resource "csidrivers" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:54 crc kubenswrapper[4772]: E0320 10:55:54.606447 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" Mar 20 10:55:54 crc kubenswrapper[4772]: I0320 10:55:54.748104 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:54 crc kubenswrapper[4772]: I0320 10:55:54.748285 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:54 crc kubenswrapper[4772]: I0320 10:55:54.749422 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:54 crc kubenswrapper[4772]: I0320 10:55:54.749454 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:54 crc kubenswrapper[4772]: I0320 10:55:54.749466 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:54 crc kubenswrapper[4772]: I0320 10:55:54.750014 4772 scope.go:117] "RemoveContainer" containerID="7752e1edc41f1d08085a02ca94e8ff6a4ee7ca0b6050603f5689c343614dd59a" Mar 20 10:55:54 crc kubenswrapper[4772]: E0320 10:55:54.750183 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:55:54 crc kubenswrapper[4772]: E0320 10:55:54.751695 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:55:55 crc kubenswrapper[4772]: I0320 10:55:55.575474 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:56 crc kubenswrapper[4772]: W0320 10:55:56.517323 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: services is forbidden: User "system:anonymous" cannot list resource "services" in API group "" at the cluster scope Mar 20 10:55:56 crc kubenswrapper[4772]: E0320 10:55:56.517366 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 10:55:56 crc kubenswrapper[4772]: I0320 10:55:56.571265 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:56 crc kubenswrapper[4772]: I0320 10:55:56.930097 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:56 crc kubenswrapper[4772]: I0320 10:55:56.931784 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:56 crc kubenswrapper[4772]: I0320 10:55:56.931893 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:56 crc kubenswrapper[4772]: I0320 10:55:56.931914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:56 crc kubenswrapper[4772]: I0320 10:55:56.931951 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:55:56 crc kubenswrapper[4772]: E0320 10:55:56.939391 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:55:56 crc kubenswrapper[4772]: E0320 10:55:56.939531 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:55:57 crc kubenswrapper[4772]: I0320 10:55:57.570748 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:58 crc kubenswrapper[4772]: I0320 10:55:58.545117 4772 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:58 crc kubenswrapper[4772]: I0320 10:55:58.545792 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:58 crc kubenswrapper[4772]: I0320 10:55:58.548116 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:58 crc kubenswrapper[4772]: I0320 10:55:58.548172 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:58 crc kubenswrapper[4772]: I0320 10:55:58.548187 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:58 crc kubenswrapper[4772]: I0320 10:55:58.553958 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 10:55:58 crc kubenswrapper[4772]: I0320 10:55:58.573313 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:58 crc kubenswrapper[4772]: I0320 10:55:58.915195 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:58 crc kubenswrapper[4772]: I0320 10:55:58.916233 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:58 crc kubenswrapper[4772]: I0320 10:55:58.916373 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:58 crc kubenswrapper[4772]: I0320 10:55:58.916443 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:59 crc kubenswrapper[4772]: I0320 10:55:59.574130 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:55:59 crc kubenswrapper[4772]: W0320 10:55:59.762778 4772 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: nodes "crc" is forbidden: User "system:anonymous" cannot list resource "nodes" in API group "" at the cluster scope Mar 20 10:55:59 crc kubenswrapper[4772]: E0320 10:55:59.762882 4772 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: nodes \"crc\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" Mar 20 10:55:59 crc kubenswrapper[4772]: I0320 10:55:59.771362 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:55:59 crc kubenswrapper[4772]: I0320 10:55:59.771613 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:55:59 crc kubenswrapper[4772]: I0320 10:55:59.773131 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:55:59 crc kubenswrapper[4772]: I0320 10:55:59.773199 4772 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:55:59 crc kubenswrapper[4772]: I0320 10:55:59.773216 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:55:59 crc kubenswrapper[4772]: I0320 10:55:59.773913 4772 scope.go:117] "RemoveContainer" containerID="7752e1edc41f1d08085a02ca94e8ff6a4ee7ca0b6050603f5689c343614dd59a" Mar 20 10:55:59 crc kubenswrapper[4772]: E0320 10:55:59.774165 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:00 crc kubenswrapper[4772]: I0320 10:56:00.573389 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:01 crc kubenswrapper[4772]: I0320 10:56:01.571726 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:02 crc kubenswrapper[4772]: I0320 10:56:02.570698 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:03 crc kubenswrapper[4772]: I0320 10:56:03.571632 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:03 crc kubenswrapper[4772]: I0320 10:56:03.939805 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:03 crc kubenswrapper[4772]: I0320 10:56:03.941260 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:03 crc kubenswrapper[4772]: I0320 10:56:03.941320 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:03 crc kubenswrapper[4772]: I0320 10:56:03.941340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:03 crc kubenswrapper[4772]: I0320 10:56:03.941372 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:56:03 crc kubenswrapper[4772]: E0320 10:56:03.946771 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:56:03 crc kubenswrapper[4772]: E0320 10:56:03.946974 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:56:04 crc kubenswrapper[4772]: I0320 
10:56:04.578597 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:04 crc kubenswrapper[4772]: E0320 10:56:04.752021 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:56:05 crc kubenswrapper[4772]: I0320 10:56:05.576088 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:06 crc kubenswrapper[4772]: I0320 10:56:06.572358 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:07 crc kubenswrapper[4772]: I0320 10:56:07.145889 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Mar 20 10:56:07 crc kubenswrapper[4772]: I0320 10:56:07.146090 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:07 crc kubenswrapper[4772]: I0320 10:56:07.147564 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:07 crc kubenswrapper[4772]: I0320 10:56:07.147639 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:07 crc kubenswrapper[4772]: I0320 10:56:07.147661 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:07 crc kubenswrapper[4772]: I0320 10:56:07.574664 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:08 crc kubenswrapper[4772]: I0320 10:56:08.573955 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:09 crc kubenswrapper[4772]: I0320 10:56:09.574701 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:10 crc kubenswrapper[4772]: I0320 10:56:10.573063 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:10 crc kubenswrapper[4772]: I0320 10:56:10.947403 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:10 crc kubenswrapper[4772]: I0320 10:56:10.949502 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:10 crc kubenswrapper[4772]: I0320 10:56:10.949756 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:10 crc kubenswrapper[4772]: I0320 10:56:10.950272 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:10 crc kubenswrapper[4772]: I0320 10:56:10.950639 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:56:10 crc kubenswrapper[4772]: E0320 10:56:10.955612 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:56:10 crc kubenswrapper[4772]: E0320 10:56:10.955767 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:56:11 crc kubenswrapper[4772]: I0320 10:56:11.573499 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:12 crc kubenswrapper[4772]: I0320 10:56:12.574027 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 10:56:13.573522 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 10:56:13.641599 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 10:56:13.643285 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 10:56:13.643352 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 10:56:13.643377 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 10:56:13.644310 4772 scope.go:117] "RemoveContainer" containerID="7752e1edc41f1d08085a02ca94e8ff6a4ee7ca0b6050603f5689c343614dd59a" Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 10:56:13.960179 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 10:56:13.961626 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5"} Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 10:56:13.961764 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 
10:56:13.962780 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 10:56:13.962827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:13 crc kubenswrapper[4772]: I0320 10:56:13.962866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4772]: I0320 10:56:14.571061 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:14 crc kubenswrapper[4772]: E0320 10:56:14.752552 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:56:14 crc kubenswrapper[4772]: I0320 10:56:14.967339 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:56:14 crc kubenswrapper[4772]: I0320 10:56:14.967902 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/2.log" Mar 20 10:56:14 crc kubenswrapper[4772]: I0320 10:56:14.970315 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5" exitCode=255 Mar 20 10:56:14 crc kubenswrapper[4772]: I0320 10:56:14.970383 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5"} Mar 20 10:56:14 crc kubenswrapper[4772]: I0320 10:56:14.970440 4772 scope.go:117] "RemoveContainer" containerID="7752e1edc41f1d08085a02ca94e8ff6a4ee7ca0b6050603f5689c343614dd59a" Mar 20 10:56:14 crc kubenswrapper[4772]: I0320 10:56:14.970709 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:14 crc kubenswrapper[4772]: I0320 10:56:14.971914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:14 crc kubenswrapper[4772]: I0320 10:56:14.971944 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:14 crc kubenswrapper[4772]: I0320 10:56:14.971956 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:14 crc kubenswrapper[4772]: I0320 10:56:14.973477 4772 scope.go:117] "RemoveContainer" containerID="a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5" Mar 20 10:56:14 crc kubenswrapper[4772]: E0320 10:56:14.973738 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:15 crc 
kubenswrapper[4772]: I0320 10:56:15.573256 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:15 crc kubenswrapper[4772]: I0320 10:56:15.975929 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:56:16 crc kubenswrapper[4772]: I0320 10:56:16.576119 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:17 crc kubenswrapper[4772]: I0320 10:56:17.572592 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:17 crc kubenswrapper[4772]: I0320 10:56:17.956077 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:17 crc kubenswrapper[4772]: I0320 10:56:17.957865 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:17 crc kubenswrapper[4772]: I0320 10:56:17.957930 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:17 crc kubenswrapper[4772]: I0320 10:56:17.957949 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:17 crc kubenswrapper[4772]: I0320 10:56:17.957982 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:56:17 crc kubenswrapper[4772]: E0320 10:56:17.961905 4772 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes is forbidden: User \"system:anonymous\" cannot create resource \"nodes\" in API group \"\" at the cluster scope" node="crc" Mar 20 10:56:17 crc kubenswrapper[4772]: E0320 10:56:17.961984 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"crc\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="7s" Mar 20 10:56:18 crc kubenswrapper[4772]: I0320 10:56:18.571381 4772 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "crc" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Mar 20 10:56:19 crc kubenswrapper[4772]: I0320 10:56:19.405575 4772 csr.go:261] certificate signing request csr-nlgzd is approved, waiting to be issued Mar 20 10:56:19 crc kubenswrapper[4772]: I0320 10:56:19.413181 4772 csr.go:257] certificate signing request csr-nlgzd is issued Mar 20 10:56:19 crc kubenswrapper[4772]: I0320 10:56:19.486880 4772 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Mar 20 10:56:19 crc kubenswrapper[4772]: I0320 10:56:19.772035 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:56:19 crc kubenswrapper[4772]: I0320 10:56:19.772237 4772 kubelet_node_status.go:401] 
"Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:19 crc kubenswrapper[4772]: I0320 10:56:19.773382 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:19 crc kubenswrapper[4772]: I0320 10:56:19.773433 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:19 crc kubenswrapper[4772]: I0320 10:56:19.773446 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:19 crc kubenswrapper[4772]: I0320 10:56:19.774089 4772 scope.go:117] "RemoveContainer" containerID="a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5" Mar 20 10:56:19 crc kubenswrapper[4772]: E0320 10:56:19.774302 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:20 crc kubenswrapper[4772]: I0320 10:56:20.396330 4772 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Mar 20 10:56:20 crc kubenswrapper[4772]: I0320 10:56:20.414624 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2027-01-14 20:35:57.268654759 +0000 UTC Mar 20 10:56:20 crc kubenswrapper[4772]: I0320 10:56:20.414716 4772 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7209h39m36.853951157s for next certificate rotation Mar 20 10:56:22 crc kubenswrapper[4772]: I0320 10:56:22.387629 4772 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.748636 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.749684 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.751222 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.751265 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.751276 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.751893 4772 scope.go:117] "RemoveContainer" containerID="a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5" Mar 20 10:56:24 crc kubenswrapper[4772]: E0320 10:56:24.752060 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" 
Mar 20 10:56:24 crc kubenswrapper[4772]: E0320 10:56:24.752823 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.962492 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.964213 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.964263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.964278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.964433 4772 kubelet_node_status.go:76] "Attempting to register node" node="crc" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.973052 4772 kubelet_node_status.go:115] "Node was previously registered" node="crc" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.973365 4772 kubelet_node_status.go:79] "Successfully registered node" node="crc" Mar 20 10:56:24 crc kubenswrapper[4772]: E0320 10:56:24.973394 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.980283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.980319 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.980333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.980366 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:24 crc kubenswrapper[4772]: I0320 10:56:24.980381 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:24Z","lastTransitionTime":"2026-03-20T10:56:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:24 crc kubenswrapper[4772]: E0320 10:56:24.996581 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:24Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:24Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:24Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:24Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.001631 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.001685 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.001707 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.001736 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.001757 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.018320 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.025675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.025736 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.025754 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.025780 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.025801 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.040580 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.044762 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.044903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.044923 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.044955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:25 crc kubenswrapper[4772]: I0320 10:56:25.044976 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:25Z","lastTransitionTime":"2026-03-20T10:56:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.055662 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:25Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.055782 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.055815 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.156125 4772 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.256336 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.356740 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.457237 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.558287 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.658892 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.759603 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.860648 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:25 crc kubenswrapper[4772]: E0320 10:56:25.961798 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:26 crc kubenswrapper[4772]: E0320 10:56:26.062213 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:26 crc kubenswrapper[4772]: E0320 10:56:26.163144 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:26 crc kubenswrapper[4772]: E0320 10:56:26.264045 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:26 crc kubenswrapper[4772]: E0320 10:56:26.365051 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:26 crc kubenswrapper[4772]: E0320 10:56:26.466074 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:26 crc kubenswrapper[4772]: E0320 10:56:26.566334 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:26 crc kubenswrapper[4772]: I0320 10:56:26.641669 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:26 crc kubenswrapper[4772]: I0320 10:56:26.642993 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:26 crc kubenswrapper[4772]: I0320 10:56:26.643023 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:26 crc kubenswrapper[4772]: I0320 10:56:26.643034 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:26 crc kubenswrapper[4772]: E0320 10:56:26.666511 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:26 crc kubenswrapper[4772]: E0320 10:56:26.767289 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:26 crc kubenswrapper[4772]: E0320 
10:56:26.868124 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:26 crc kubenswrapper[4772]: E0320 10:56:26.969107 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:27 crc kubenswrapper[4772]: E0320 10:56:27.069685 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:27 crc kubenswrapper[4772]: E0320 10:56:27.170041 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:27 crc kubenswrapper[4772]: E0320 10:56:27.270127 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:27 crc kubenswrapper[4772]: E0320 10:56:27.371017 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:27 crc kubenswrapper[4772]: E0320 10:56:27.471541 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:27 crc kubenswrapper[4772]: E0320 10:56:27.572601 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:27 crc kubenswrapper[4772]: E0320 10:56:27.673277 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:27 crc kubenswrapper[4772]: E0320 10:56:27.774258 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:27 crc kubenswrapper[4772]: E0320 10:56:27.874819 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:27 crc kubenswrapper[4772]: E0320 10:56:27.975243 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:28 crc kubenswrapper[4772]: E0320 10:56:28.075792 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:28 crc kubenswrapper[4772]: E0320 10:56:28.175935 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:28 crc kubenswrapper[4772]: E0320 10:56:28.277140 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:28 crc kubenswrapper[4772]: E0320 10:56:28.377761 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:28 crc kubenswrapper[4772]: E0320 10:56:28.478789 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:28 crc kubenswrapper[4772]: E0320 10:56:28.578904 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:28 crc kubenswrapper[4772]: E0320 10:56:28.680042 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:28 crc kubenswrapper[4772]: E0320 10:56:28.780558 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:28 crc kubenswrapper[4772]: E0320 10:56:28.880773 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:28 crc 
kubenswrapper[4772]: E0320 10:56:28.981434 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:29 crc kubenswrapper[4772]: E0320 10:56:29.082063 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:29 crc kubenswrapper[4772]: E0320 10:56:29.182607 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:29 crc kubenswrapper[4772]: E0320 10:56:29.283526 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:29 crc kubenswrapper[4772]: E0320 10:56:29.383727 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:29 crc kubenswrapper[4772]: E0320 10:56:29.484240 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:29 crc kubenswrapper[4772]: E0320 10:56:29.584608 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:29 crc kubenswrapper[4772]: E0320 10:56:29.685441 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:29 crc kubenswrapper[4772]: E0320 10:56:29.786376 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:29 crc kubenswrapper[4772]: E0320 10:56:29.886808 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:29 crc kubenswrapper[4772]: E0320 10:56:29.987875 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:30 crc kubenswrapper[4772]: E0320 10:56:30.088023 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:30 crc kubenswrapper[4772]: E0320 10:56:30.188722 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:30 crc kubenswrapper[4772]: E0320 10:56:30.289143 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:30 crc kubenswrapper[4772]: E0320 10:56:30.389378 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:30 crc kubenswrapper[4772]: E0320 10:56:30.490634 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:30 crc kubenswrapper[4772]: E0320 10:56:30.591711 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:30 crc kubenswrapper[4772]: E0320 10:56:30.692864 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:30 crc kubenswrapper[4772]: E0320 10:56:30.793695 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:30 crc kubenswrapper[4772]: E0320 10:56:30.894301 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:30 crc kubenswrapper[4772]: E0320 10:56:30.994482 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 
20 10:56:31 crc kubenswrapper[4772]: E0320 10:56:31.095398 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:31 crc kubenswrapper[4772]: E0320 10:56:31.196708 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:31 crc kubenswrapper[4772]: E0320 10:56:31.297233 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:31 crc kubenswrapper[4772]: E0320 10:56:31.397390 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:31 crc kubenswrapper[4772]: E0320 10:56:31.498428 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:31 crc kubenswrapper[4772]: E0320 10:56:31.598806 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:31 crc kubenswrapper[4772]: E0320 10:56:31.699162 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:31 crc kubenswrapper[4772]: E0320 10:56:31.799324 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:31 crc kubenswrapper[4772]: E0320 10:56:31.900165 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:32 crc kubenswrapper[4772]: E0320 10:56:32.000907 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:32 crc kubenswrapper[4772]: E0320 10:56:32.101771 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:32 crc kubenswrapper[4772]: E0320 10:56:32.202925 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:32 crc kubenswrapper[4772]: E0320 10:56:32.303688 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:32 crc kubenswrapper[4772]: E0320 10:56:32.404134 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:32 crc kubenswrapper[4772]: E0320 10:56:32.504652 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:32 crc kubenswrapper[4772]: E0320 10:56:32.604947 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:32 crc kubenswrapper[4772]: E0320 10:56:32.705830 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:32 crc kubenswrapper[4772]: E0320 10:56:32.806048 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:32 crc kubenswrapper[4772]: E0320 10:56:32.906881 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:33 crc kubenswrapper[4772]: E0320 10:56:33.007511 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:33 crc kubenswrapper[4772]: E0320 10:56:33.108492 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" 
not found" Mar 20 10:56:33 crc kubenswrapper[4772]: E0320 10:56:33.209297 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:33 crc kubenswrapper[4772]: E0320 10:56:33.309938 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:33 crc kubenswrapper[4772]: E0320 10:56:33.410467 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:33 crc kubenswrapper[4772]: E0320 10:56:33.511304 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:33 crc kubenswrapper[4772]: E0320 10:56:33.611784 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:33 crc kubenswrapper[4772]: E0320 10:56:33.712947 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:33 crc kubenswrapper[4772]: E0320 10:56:33.813813 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:33 crc kubenswrapper[4772]: E0320 10:56:33.914692 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:34 crc kubenswrapper[4772]: E0320 10:56:34.015249 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:34 crc kubenswrapper[4772]: E0320 10:56:34.116163 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:34 crc kubenswrapper[4772]: E0320 10:56:34.217110 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:34 crc kubenswrapper[4772]: E0320 10:56:34.318071 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:34 crc kubenswrapper[4772]: E0320 10:56:34.419045 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:34 crc kubenswrapper[4772]: E0320 10:56:34.520098 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:34 crc kubenswrapper[4772]: E0320 10:56:34.621204 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:34 crc kubenswrapper[4772]: E0320 10:56:34.721387 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:34 crc kubenswrapper[4772]: E0320 10:56:34.754276 4772 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Mar 20 10:56:34 crc kubenswrapper[4772]: E0320 10:56:34.821832 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:34 crc kubenswrapper[4772]: E0320 10:56:34.922537 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.023677 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.124103 4772 kubelet_node_status.go:503] "Error getting the 
current node from lister" err="node \"crc\" not found" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.224928 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.325615 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.398306 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="error getting node \"crc\": node \"crc\" not found" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.404242 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.404309 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.404331 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.404359 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.404384 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.421010 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.426328 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.426374 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 
10:56:35.426389 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.426412 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.426427 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.441389 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.446210 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.446247 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.446258 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.446272 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.446282 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.459791 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.464490 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.464541 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.464558 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.464583 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:35 crc kubenswrapper[4772]: I0320 10:56:35.464601 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:35Z","lastTransitionTime":"2026-03-20T10:56:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.481012 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:35Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.481338 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.481406 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.581909 4772 
kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.682808 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.783159 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.883456 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:35 crc kubenswrapper[4772]: E0320 10:56:35.984395 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:36 crc kubenswrapper[4772]: E0320 10:56:36.085643 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:36 crc kubenswrapper[4772]: E0320 10:56:36.186341 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:36 crc kubenswrapper[4772]: E0320 10:56:36.286973 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:36 crc kubenswrapper[4772]: E0320 10:56:36.387906 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:36 crc kubenswrapper[4772]: E0320 10:56:36.488261 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:36 crc kubenswrapper[4772]: E0320 10:56:36.589252 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:36 crc kubenswrapper[4772]: E0320 10:56:36.689480 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:36 crc kubenswrapper[4772]: E0320 10:56:36.790128 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:36 crc kubenswrapper[4772]: E0320 10:56:36.890711 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:36 crc kubenswrapper[4772]: E0320 10:56:36.990884 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:37 crc kubenswrapper[4772]: E0320 10:56:37.092009 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:37 crc kubenswrapper[4772]: E0320 10:56:37.192925 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:37 crc kubenswrapper[4772]: E0320 10:56:37.293900 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:37 crc kubenswrapper[4772]: E0320 10:56:37.395034 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:37 crc kubenswrapper[4772]: E0320 10:56:37.495242 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:37 crc kubenswrapper[4772]: E0320 10:56:37.595912 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:37 crc kubenswrapper[4772]: I0320 
10:56:37.641473 4772 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Mar 20 10:56:37 crc kubenswrapper[4772]: I0320 10:56:37.643324 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:37 crc kubenswrapper[4772]: I0320 10:56:37.643371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:37 crc kubenswrapper[4772]: I0320 10:56:37.643395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:37 crc kubenswrapper[4772]: I0320 10:56:37.644352 4772 scope.go:117] "RemoveContainer" containerID="a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5" Mar 20 10:56:37 crc kubenswrapper[4772]: E0320 10:56:37.644719 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:37 crc kubenswrapper[4772]: E0320 10:56:37.697183 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:37 crc kubenswrapper[4772]: E0320 10:56:37.798154 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:37 crc kubenswrapper[4772]: E0320 10:56:37.898737 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:37 crc kubenswrapper[4772]: E0320 10:56:37.999564 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:38 crc kubenswrapper[4772]: E0320 10:56:38.100551 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:38 crc kubenswrapper[4772]: E0320 10:56:38.200756 4772 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.247701 4772 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.304077 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.304139 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.304157 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.304183 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.304202 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:38Z","lastTransitionTime":"2026-03-20T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no 
CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.406706 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.406775 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.406798 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.406876 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.406896 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:38Z","lastTransitionTime":"2026-03-20T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.510501 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.510561 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.510578 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.510602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.510621 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:38Z","lastTransitionTime":"2026-03-20T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.613606 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.613649 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.613660 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.613675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.613692 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:38Z","lastTransitionTime":"2026-03-20T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.716703 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.716760 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.716786 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.716813 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.716829 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:38Z","lastTransitionTime":"2026-03-20T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.819129 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.819197 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.819214 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.819233 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.819246 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:38Z","lastTransitionTime":"2026-03-20T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.922878 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.922938 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.922955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.922978 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.923004 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:38Z","lastTransitionTime":"2026-03-20T10:56:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.953215 4772 apiserver.go:52] "Watching apiserver" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.959212 4772 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.959684 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb","openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-dns/node-resolver-k4qd4","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf"] Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.960213 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.960315 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.960381 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.960506 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:38 crc kubenswrapper[4772]: E0320 10:56:38.960495 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.960540 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:38 crc kubenswrapper[4772]: E0320 10:56:38.960941 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.961289 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-k4qd4" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.961360 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:38 crc kubenswrapper[4772]: E0320 10:56:38.961479 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.963544 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.964884 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.964920 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.965016 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.965778 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.965786 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.965910 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.965989 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.966134 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.966289 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.967038 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.967056 4772 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.971090 4772 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.981380 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:38 crc kubenswrapper[4772]: I0320 10:56:38.997367 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.009780 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.009818 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010063 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010096 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010123 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010153 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010179 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010258 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010279 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010305 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010324 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010346 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010365 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010382 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010401 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010420 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010437 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010460 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010479 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010497 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010515 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010535 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010553 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010570 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010585 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod 
\"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010603 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010621 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010638 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010652 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010669 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010683 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010700 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010699 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010715 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010742 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010779 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.010994 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011016 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011078 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011136 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011161 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011095 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011184 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011259 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011260 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011362 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011392 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011417 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011443 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011467 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011491 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011512 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011532 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011543 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011554 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011586 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011583 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011640 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011786 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). 
InnerVolumeSpecName "kube-api-access-nzwt7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011806 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.012751 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.012804 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.012952 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013029 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013090 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013225 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013261 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013525 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.011602 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013630 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013689 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013759 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013706 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013809 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013934 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.013988 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014048 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014097 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014144 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014195 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014204 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014256 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014289 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014306 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014359 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014405 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014429 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014451 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014454 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014475 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014496 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014550 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014604 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014631 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014648 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014709 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014707 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014762 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014810 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014900 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014955 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015006 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015058 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015107 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015164 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015213 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015259 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015314 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015359 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015404 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015454 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015503 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015554 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015605 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015654 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015701 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015747 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015795 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016090 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016161 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016211 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016256 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016307 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016380 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016431 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016481 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016545 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016594 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016646 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016700 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016747 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016801 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016888 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016948 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016998 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017048 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017099 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017149 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017234 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017299 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017355 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017403 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017455 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017519 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017569 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017618 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017665 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017717 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017767 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017815 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017906 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017960 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018009 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018059 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018108 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018162 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 
10:56:39.018223 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018277 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018324 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018386 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018425 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018458 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018512 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018547 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018583 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018618 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018652 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018688 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018722 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018756 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018790 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018827 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018936 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018973 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019010 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019044 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019078 4772 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019114 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019149 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019194 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019242 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019293 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019344 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019396 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019455 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019513 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:39 crc 
kubenswrapper[4772]: I0320 10:56:39.020974 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.021043 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.021088 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.021124 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.021158 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014815 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014863 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.014758 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015165 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015182 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015449 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015682 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015677 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.021525 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015716 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015770 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). 
InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.015955 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016227 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016199 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016308 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016387 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016409 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016435 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016615 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016714 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.016949 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017103 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017200 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.017337 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.018544 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019218 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). 
InnerVolumeSpecName "kube-api-access-9xfj7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.019662 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.020133 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.021648 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.021775 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.022167 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.022179 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.023550 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.022636 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.023790 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.023610 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.022674 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.022709 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.022734 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.023464 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.023593 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.023631 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.024092 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.024538 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.024562 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.021186 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.024807 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025013 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025062 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025089 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025100 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025138 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025208 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025323 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025375 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025413 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025449 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025481 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025480 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025515 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025547 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025580 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025615 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025670 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025728 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025768 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025790 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025806 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025866 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025902 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025914 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025984 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025974 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.026037 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.026724 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.025937 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027130 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027154 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027175 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027195 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027214 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027233 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027253 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027274 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027294 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: 
\"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027356 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027363 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027577 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027645 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027884 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028004 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.027917 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028089 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028155 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028209 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028220 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028355 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028373 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028406 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028413 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028446 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028482 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028610 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028655 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028728 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f7e1e7e8-3b09-4e05-a31d-a74713a885f3-hosts-file\") pod \"node-resolver-k4qd4\" (UID: \"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\") " pod="openshift-dns/node-resolver-k4qd4" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028754 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028786 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028825 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028894 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.028900 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029087 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029120 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029151 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029172 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029193 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029214 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029210 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029237 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029261 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029307 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030126 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qn484\" (UniqueName: \"kubernetes.io/projected/f7e1e7e8-3b09-4e05-a31d-a74713a885f3-kube-api-access-qn484\") pod \"node-resolver-k4qd4\" (UID: \"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\") " pod="openshift-dns/node-resolver-k4qd4" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030223 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030288 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030419 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030557 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030610 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030655 4772 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030685 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030713 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030742 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030770 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030797 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030826 4772 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030890 4772 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030918 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030956 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030983 4772 reconciler_common.go:293] "Volume detached for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031010 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031038 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031066 4772 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031095 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031121 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029472 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029519 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.029573 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030029 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030159 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.030580 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031007 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031047 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031062 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031109 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.031205 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031467 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031546 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.031220 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:56:39.531180579 +0000 UTC m=+85.622147254 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.031663 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:39.531634073 +0000 UTC m=+85.622600598 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.031835 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.032269 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.032485 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.033080 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.033131 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.033150 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.033175 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.033193 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:39Z","lastTransitionTime":"2026-03-20T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.033252 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.033486 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.034913 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.036283 4772 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.040568 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.040631 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.040944 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.041068 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.041410 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.041942 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.041975 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.041984 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.042530 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.042589 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.042715 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.042824 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.042923 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.043393 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.043467 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044311 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044339 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044596 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044680 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044786 4772 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044822 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044871 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044894 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044920 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044939 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044960 4772 reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Mar 20 
10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.044981 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045007 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045031 4772 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045052 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045083 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045115 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045142 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045169 4772 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045197 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045225 4772 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045253 4772 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045284 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045312 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath 
\"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045339 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045367 4772 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045394 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045422 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045489 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045519 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045552 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045580 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045608 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045687 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045712 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045732 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045755 4772 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" 
DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045774 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045797 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045818 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045838 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045887 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045944 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045965 4772 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045983 4772 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046003 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046022 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046041 4772 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046061 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046081 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath 
\"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046102 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046124 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046144 4772 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046165 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046186 4772 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046214 4772 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046243 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046263 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046283 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046405 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046432 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046452 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046472 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on 
node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046493 4772 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046513 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046533 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046555 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046574 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046599 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046618 4772 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046640 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046671 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046691 4772 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046711 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046730 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046749 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath 
\"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046769 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046794 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046813 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.046832 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.047553 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.044606 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.047733 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:39.547702843 +0000 UTC m=+85.638669358 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045580 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.045861 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.046039 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.047811 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.047751 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.047364 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.047833 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.047432 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.047501 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.047410 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.048022 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:39.547997371 +0000 UTC m=+85.638963896 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.048023 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.048086 4772 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.048119 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.048144 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.055394 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.055953 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.056072 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.056178 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.056272 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.056415 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.056510 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.056551 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.056672 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.056990 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.057069 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.057467 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.057526 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.058437 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.058578 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.058632 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.058738 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.058765 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.058806 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.058923 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.058967 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.059484 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.059582 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.059734 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.060195 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.060366 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.060867 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). 
InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.061130 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.059261 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.060830 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.060827 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.063666 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.068551 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.068593 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.068615 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.068708 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:39.56867334 +0000 UTC m=+85.659639855 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.070950 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.075368 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.075568 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.075400 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.075819 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.075790 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.076177 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.076699 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.076746 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.080596 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.080758 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.080908 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.081758 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.081784 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.082015 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.081907 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.081902 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.082402 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.082577 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.082811 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.083005 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.083184 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.084373 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.084422 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.084468 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.085719 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.093117 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.099160 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.099977 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.103282 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.137219 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.137255 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.137264 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.137278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.137288 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:39Z","lastTransitionTime":"2026-03-20T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149051 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f7e1e7e8-3b09-4e05-a31d-a74713a885f3-hosts-file\") pod \"node-resolver-k4qd4\" (UID: \"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\") " pod="openshift-dns/node-resolver-k4qd4" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149125 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149245 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149264 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149335 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qn484\" (UniqueName: \"kubernetes.io/projected/f7e1e7e8-3b09-4e05-a31d-a74713a885f3-kube-api-access-qn484\") pod 
\"node-resolver-k4qd4\" (UID: \"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\") " pod="openshift-dns/node-resolver-k4qd4" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149417 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149439 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149440 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149458 4772 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149532 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149557 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149576 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149607 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149625 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149634 4772 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149645 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149655 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149638 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/f7e1e7e8-3b09-4e05-a31d-a74713a885f3-hosts-file\") pod \"node-resolver-k4qd4\" (UID: \"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\") " pod="openshift-dns/node-resolver-k4qd4" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149666 4772 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149707 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149719 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149733 4772 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149745 4772 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149763 4772 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149777 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149787 4772 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149797 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149806 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149815 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149825 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149865 4772 
reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149881 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149892 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149903 4772 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149917 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149927 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149937 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149948 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149957 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149967 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149977 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.149987 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150000 4772 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150010 4772 reconciler_common.go:293] "Volume detached 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150021 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150031 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150040 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150050 4772 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150059 4772 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150069 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150079 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150089 4772 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150098 4772 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150107 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150117 4772 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150127 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150137 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: 
\"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150146 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150157 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150167 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150215 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150255 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150276 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150296 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150315 4772 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150332 4772 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150349 4772 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150367 4772 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150384 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150402 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: 
\"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150421 4772 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150439 4772 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150456 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150474 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150491 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150508 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150525 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150543 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150562 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150628 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150653 4772 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150675 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150694 4772 reconciler_common.go:293] "Volume detached for volume 
\"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150715 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150736 4772 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150757 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150778 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150798 4772 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150818 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150840 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150887 4772 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150905 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150921 4772 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150939 4772 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150958 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150975 4772 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.150993 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.151010 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.151027 4772 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.151045 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.171356 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qn484\" (UniqueName: \"kubernetes.io/projected/f7e1e7e8-3b09-4e05-a31d-a74713a885f3-kube-api-access-qn484\") pod \"node-resolver-k4qd4\" (UID: \"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\") " pod="openshift-dns/node-resolver-k4qd4" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.240652 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.240716 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.240728 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.240745 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.240756 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:39Z","lastTransitionTime":"2026-03-20T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.248250 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-ltsw5"] Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.249298 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-7fpq9"] Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.249433 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.249527 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.251120 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-tmktf"] Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.251946 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.252102 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.252379 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.252647 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.252765 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.252806 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.253375 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.253650 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.253893 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.253956 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.254237 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.254706 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.255119 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.267364 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.279529 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.281678 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.291576 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.293679 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.305096 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: W0320 10:56:39.305213 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-8f5d7498bea8ad4b2c46b0ff4dd358e77088a8f1d10251e20e8f907e368aee59 WatchSource:0}: Error finding container 8f5d7498bea8ad4b2c46b0ff4dd358e77088a8f1d10251e20e8f907e368aee59: Status 404 returned error can't find the container with id 8f5d7498bea8ad4b2c46b0ff4dd358e77088a8f1d10251e20e8f907e368aee59 Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.306685 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.313435 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.314463 4772 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.324258 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-k4qd4" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.326074 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: W0320 10:56:39.332006 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-a94f9ff87a3335ab35e2f9534ac91beec26d3d7fbc63812801ca72473ccfe32a WatchSource:0}: Error finding container a94f9ff87a3335ab35e2f9534ac91beec26d3d7fbc63812801ca72473ccfe32a: Status 404 returned error can't find the container with id a94f9ff87a3335ab35e2f9534ac91beec26d3d7fbc63812801ca72473ccfe32a Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.336533 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.342600 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.342647 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.342659 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.342678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.342691 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:39Z","lastTransitionTime":"2026-03-20T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.347002 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: W0320 10:56:39.347643 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf7e1e7e8_3b09_4e05_a31d_a74713a885f3.slice/crio-692050369f482da73c5a47b6768f7df2bc2a7abfc8ba6040748bdcabd810d130 WatchSource:0}: Error finding container 692050369f482da73c5a47b6768f7df2bc2a7abfc8ba6040748bdcabd810d130: Status 404 returned error can't find the container with id 692050369f482da73c5a47b6768f7df2bc2a7abfc8ba6040748bdcabd810d130 Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.354827 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.354988 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-rootfs\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " 
pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.355805 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-run-netns\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.355847 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-system-cni-dir\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.355871 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-cni-dir\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.355911 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-var-lib-cni-bin\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.355952 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-run-multus-certs\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.355988 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xmwgg\" (UniqueName: \"kubernetes.io/projected/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-kube-api-access-xmwgg\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356017 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-cnibin\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356047 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-cni-binary-copy\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356071 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-conf-dir\") pod \"multus-7fpq9\" (UID: 
\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356093 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-cnibin\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356117 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-cni-binary-copy\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356179 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356259 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wfrr\" (UniqueName: \"kubernetes.io/projected/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-kube-api-access-5wfrr\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356294 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-run-k8s-cni-cncf-io\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356321 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-daemon-config\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356366 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-proxy-tls\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356389 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-mcd-auth-proxy-config\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356593 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-socket-dir-parent\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356659 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-var-lib-kubelet\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356682 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-etc-kubernetes\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356706 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-system-cni-dir\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356731 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-os-release\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356754 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d5pfq\" (UniqueName: \"kubernetes.io/projected/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-kube-api-access-d5pfq\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356826 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-os-release\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356914 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-hostroot\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.356936 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 
10:56:39.356962 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-var-lib-cni-multus\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.365657 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.374267 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.383721 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.395795 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.405563 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.416702 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.425227 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.436604 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.448767 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.449065 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.449135 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.449147 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.449162 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.449174 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:39Z","lastTransitionTime":"2026-03-20T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.457760 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\
\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.457951 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-run-k8s-cni-cncf-io\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.457988 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-daemon-config\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458013 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wfrr\" (UniqueName: \"kubernetes.io/projected/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-kube-api-access-5wfrr\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458049 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-socket-dir-parent\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458072 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-var-lib-kubelet\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458096 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-etc-kubernetes\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458111 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-socket-dir-parent\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458118 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-system-cni-dir\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458072 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-run-k8s-cni-cncf-io\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458152 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-var-lib-kubelet\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458157 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-system-cni-dir\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458183 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-proxy-tls\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458203 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-etc-kubernetes\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458205 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-mcd-auth-proxy-config\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458294 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-os-release\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458323 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d5pfq\" (UniqueName: \"kubernetes.io/projected/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-kube-api-access-d5pfq\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458345 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-os-release\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458362 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-hostroot\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458377 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458397 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-var-lib-cni-multus\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458447 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-rootfs\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458471 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-os-release\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458493 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-run-netns\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458511 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-system-cni-dir\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458526 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-cni-dir\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458542 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-var-lib-cni-bin\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458559 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-run-multus-certs\") pod \"multus-7fpq9\" (UID: 
\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458556 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-os-release\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458576 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xmwgg\" (UniqueName: \"kubernetes.io/projected/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-kube-api-access-xmwgg\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458591 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-cnibin\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458609 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-cni-binary-copy\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458625 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-conf-dir\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458639 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-cnibin\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458654 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-cni-binary-copy\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458672 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458797 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-var-lib-cni-bin\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc 
kubenswrapper[4772]: I0320 10:56:39.458839 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-hostroot\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.458955 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-mcd-auth-proxy-config\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459163 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-run-multus-certs\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459156 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-conf-dir\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459220 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-cnibin\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459284 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-run-netns\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459315 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-host-var-lib-cni-multus\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459346 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-rootfs\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459380 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-cnibin\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459465 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-system-cni-dir\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459480 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459526 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-tuning-conf-dir\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459555 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-cni-dir\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459788 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-cni-binary-copy\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.459810 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-cni-binary-copy\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.460303 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-multus-daemon-config\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.465068 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-proxy-tls\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.473179 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d5pfq\" (UniqueName: \"kubernetes.io/projected/ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e-kube-api-access-d5pfq\") pod \"machine-config-daemon-ltsw5\" (UID: \"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\") " pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.473398 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wfrr\" (UniqueName: 
\"kubernetes.io/projected/8e02a490-1cd4-40f2-baeb-f04ce5317e4d-kube-api-access-5wfrr\") pod \"multus-additional-cni-plugins-tmktf\" (UID: \"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\") " pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.475767 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xmwgg\" (UniqueName: \"kubernetes.io/projected/a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d-kube-api-access-xmwgg\") pod \"multus-7fpq9\" (UID: \"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\") " pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.556360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.556397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.556405 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.556418 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.556426 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:39Z","lastTransitionTime":"2026-03-20T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.559047 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.559104 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.559126 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.559148 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:39 crc 
kubenswrapper[4772]: E0320 10:56:39.559197 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:56:40.559175544 +0000 UTC m=+86.650142029 (durationBeforeRetry 1s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.559234 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.559272 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:40.559261096 +0000 UTC m=+86.650227581 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.559289 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.559329 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.559349 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.559361 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.559461 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:40.559323968 +0000 UTC m=+86.650290453 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.559489 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:40.559476142 +0000 UTC m=+86.650442627 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.563416 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.570561 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-7fpq9" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.574292 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-tmktf" Mar 20 10:56:39 crc kubenswrapper[4772]: W0320 10:56:39.574491 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea07e2c1_7a61_4afd_97b4_22f8f9dc5c3e.slice/crio-77356b95c6a559131accb4351a915678a46449ab22c83daaa633c5c90ac6242b WatchSource:0}: Error finding container 77356b95c6a559131accb4351a915678a46449ab22c83daaa633c5c90ac6242b: Status 404 returned error can't find the container with id 77356b95c6a559131accb4351a915678a46449ab22c83daaa633c5c90ac6242b Mar 20 10:56:39 crc kubenswrapper[4772]: W0320 10:56:39.586131 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda04aeeb5_2fa5_4466_ac01_e8d9fb19a88d.slice/crio-07fa1adbc89d0d3ff9345bb70eef0dcb8cabbcf0684491e62f5ebe82286be361 WatchSource:0}: Error finding container 07fa1adbc89d0d3ff9345bb70eef0dcb8cabbcf0684491e62f5ebe82286be361: Status 404 returned error can't find the container with id 07fa1adbc89d0d3ff9345bb70eef0dcb8cabbcf0684491e62f5ebe82286be361 Mar 20 10:56:39 crc kubenswrapper[4772]: W0320 10:56:39.591647 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e02a490_1cd4_40f2_baeb_f04ce5317e4d.slice/crio-85db403002d9e446cd514e60dbb0756f499c10ada49b16eba7e5cd94d3c07868 WatchSource:0}: Error finding container 85db403002d9e446cd514e60dbb0756f499c10ada49b16eba7e5cd94d3c07868: Status 404 returned error can't find the container with id 85db403002d9e446cd514e60dbb0756f499c10ada49b16eba7e5cd94d3c07868 Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.622999 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8p9x"] Mar 20 10:56:39 crc 
kubenswrapper[4772]: I0320 10:56:39.623714 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.632818 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.632821 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.633070 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.633100 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.633495 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.633573 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.633901 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.644371 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.656559 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.658037 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.658084 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.658099 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.658121 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.658136 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:39Z","lastTransitionTime":"2026-03-20T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.659560 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.659687 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.659703 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.659712 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:39 crc kubenswrapper[4772]: E0320 10:56:39.659746 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:40.659734627 +0000 UTC m=+86.750701112 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.669228 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.681666 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.693594 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.702497 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.714525 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started
\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"na
me\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.727341 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.736782 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.745377 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760251 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-node-log\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760287 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-script-lib\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760335 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-bin\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760355 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-systemd-units\") pod 
\"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760374 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-openvswitch\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760411 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-netd\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760504 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-etc-openvswitch\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760549 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-ovn\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760611 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-log-socket\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760661 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760686 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d62da04c-5422-4320-9352-8959b89501be-ovn-node-metrics-cert\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760709 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-kubelet\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760756 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-var-lib-openvswitch\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760677 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760795 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-js95g\" (UniqueName: \"kubernetes.io/projected/d62da04c-5422-4320-9352-8959b89501be-kube-api-access-js95g\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760888 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-slash\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760915 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-netns\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.760976 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-config\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.761037 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-systemd\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.761115 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.761143 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-env-overrides\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.761413 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.761460 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.761474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.761492 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.761504 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:39Z","lastTransitionTime":"2026-03-20T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.862503 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-systemd\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.862565 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.862614 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-env-overrides\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.862642 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-script-lib\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.862615 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-ovn-kubernetes\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.862613 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-systemd\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863281 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-env-overrides\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863346 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-node-log\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863373 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-bin\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863487 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-systemd-units\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863511 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-openvswitch\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863569 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-openvswitch\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863411 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-node-log\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863379 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-script-lib\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863557 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-systemd-units\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863587 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-netd\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863611 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-netd\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863629 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-etc-openvswitch\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863689 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-ovn\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863713 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-log-socket\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863736 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863738 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-ovn\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863645 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-etc-openvswitch\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863782 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-log-socket\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863763 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d62da04c-5422-4320-9352-8959b89501be-ovn-node-metrics-cert\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863428 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-bin\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863797 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863821 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-kubelet\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863859 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-var-lib-openvswitch\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863888 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-js95g\" (UniqueName: \"kubernetes.io/projected/d62da04c-5422-4320-9352-8959b89501be-kube-api-access-js95g\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863920 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-slash\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863927 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-var-lib-openvswitch\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863921 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-kubelet\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863962 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-netns\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863974 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-slash\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.863992 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-config\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.864095 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-netns\") pod \"ovnkube-node-z8p9x\" (UID: 
\"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.864412 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.864436 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.864447 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.864463 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.864474 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:39Z","lastTransitionTime":"2026-03-20T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.864519 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-config\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.866686 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d62da04c-5422-4320-9352-8959b89501be-ovn-node-metrics-cert\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.880114 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-js95g\" (UniqueName: \"kubernetes.io/projected/d62da04c-5422-4320-9352-8959b89501be-kube-api-access-js95g\") pod \"ovnkube-node-z8p9x\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.963394 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.966265 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.966308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.966322 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.966341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:39 crc kubenswrapper[4772]: I0320 10:56:39.966353 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:39Z","lastTransitionTime":"2026-03-20T10:56:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:39 crc kubenswrapper[4772]: W0320 10:56:39.974914 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd62da04c_5422_4320_9352_8959b89501be.slice/crio-b510773ae06d1b3a3b1821abb61a4eb0f7748a9df1b3b32be318e47589d9cc9d WatchSource:0}: Error finding container b510773ae06d1b3a3b1821abb61a4eb0f7748a9df1b3b32be318e47589d9cc9d: Status 404 returned error can't find the container with id b510773ae06d1b3a3b1821abb61a4eb0f7748a9df1b3b32be318e47589d9cc9d Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.050449 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7fpq9" event={"ID":"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d","Type":"ContainerStarted","Data":"de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.050546 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7fpq9" event={"ID":"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d","Type":"ContainerStarted","Data":"07fa1adbc89d0d3ff9345bb70eef0dcb8cabbcf0684491e62f5ebe82286be361"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.051932 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.051973 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.051983 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"77356b95c6a559131accb4351a915678a46449ab22c83daaa633c5c90ac6242b"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.053475 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-dns/node-resolver-k4qd4" event={"ID":"f7e1e7e8-3b09-4e05-a31d-a74713a885f3","Type":"ContainerStarted","Data":"d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.053500 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-k4qd4" event={"ID":"f7e1e7e8-3b09-4e05-a31d-a74713a885f3","Type":"ContainerStarted","Data":"692050369f482da73c5a47b6768f7df2bc2a7abfc8ba6040748bdcabd810d130"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.056612 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.056767 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.056983 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"b987908a42c18b9f1aa825e1b18fb8c64deb6e51a9812c775f2baee46463fb0a"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.058427 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.058455 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"8f5d7498bea8ad4b2c46b0ff4dd358e77088a8f1d10251e20e8f907e368aee59"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.060757 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"a94f9ff87a3335ab35e2f9534ac91beec26d3d7fbc63812801ca72473ccfe32a"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.062003 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"b510773ae06d1b3a3b1821abb61a4eb0f7748a9df1b3b32be318e47589d9cc9d"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.064772 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.065370 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e02a490-1cd4-40f2-baeb-f04ce5317e4d" containerID="4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e" exitCode=0 Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.065411 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" event={"ID":"8e02a490-1cd4-40f2-baeb-f04ce5317e4d","Type":"ContainerDied","Data":"4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.065564 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" event={"ID":"8e02a490-1cd4-40f2-baeb-f04ce5317e4d","Type":"ContainerStarted","Data":"85db403002d9e446cd514e60dbb0756f499c10ada49b16eba7e5cd94d3c07868"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.074481 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.074544 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.074558 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientPID" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.074577 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.074588 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:40Z","lastTransitionTime":"2026-03-20T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.080491 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.092401 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\
\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.105206 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"na
me\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.116572 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.133830 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.144271 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.156931 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.172515 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.176312 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.176339 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.176347 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.176362 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.176370 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:40Z","lastTransitionTime":"2026-03-20T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.184691 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: 
I0320 10:56:40.202548 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":
false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\
\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift
-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.221141 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.233368 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.246107 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.257730 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.269575 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.279353 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.279402 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.279414 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.279432 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.279446 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:40Z","lastTransitionTime":"2026-03-20T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.281602 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.291931 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.308014 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.320321 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.334267 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.346061 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.381472 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.381509 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.381517 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.381533 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.381542 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:40Z","lastTransitionTime":"2026-03-20T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.484440 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.484474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.484488 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.484508 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.484523 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:40Z","lastTransitionTime":"2026-03-20T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.570593 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.570791 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:56:42.570766423 +0000 UTC m=+88.661732908 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.570960 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.570984 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.571002 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.571089 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.571120 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.571140 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:42.571127653 +0000 UTC m=+88.662094138 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.571169 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:42.571160494 +0000 UTC m=+88.662126979 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.571182 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.571213 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.571224 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.571281 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:42.571266436 +0000 UTC m=+88.662232921 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.588181 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.588224 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.588234 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.588247 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.588258 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:40Z","lastTransitionTime":"2026-03-20T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.640970 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.640989 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.641177 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.641252 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.641894 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.642025 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.644448 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.645116 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.645795 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.646414 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.646994 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.647489 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.649092 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.649701 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.650718 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.651385 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.651954 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.653012 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.653544 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.654386 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" 
path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.654989 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.655945 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.656544 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.656959 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.658014 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.658599 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.659162 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.660272 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.661199 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.662315 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.662780 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.663761 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.664531 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.665425 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" 
path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.666080 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.666955 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.667396 4772 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.667491 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.669178 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.670111 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.670494 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.671608 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.671767 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.671794 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.671819 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:40 crc kubenswrapper[4772]: E0320 10:56:40.671934 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:42.671916561 +0000 UTC m=+88.762883046 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.671974 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.673068 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.673630 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.674643 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.675285 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.676108 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.676665 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.677680 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.678277 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.679112 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.679724 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.680629 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.681621 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.682307 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.683339 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.684068 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.684546 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.685534 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.686024 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.695233 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.695261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.695274 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.695289 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.695301 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:40Z","lastTransitionTime":"2026-03-20T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.797743 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.798113 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.798129 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.798146 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.798158 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:40Z","lastTransitionTime":"2026-03-20T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.900166 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.900196 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.900204 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.900217 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:40 crc kubenswrapper[4772]: I0320 10:56:40.900245 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:40Z","lastTransitionTime":"2026-03-20T10:56:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.003362 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.003408 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.003419 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.003438 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.003450 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:41Z","lastTransitionTime":"2026-03-20T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.070339 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d" exitCode=0 Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.070425 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.081534 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e02a490-1cd4-40f2-baeb-f04ce5317e4d" containerID="00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093" exitCode=0 Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.081635 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" event={"ID":"8e02a490-1cd4-40f2-baeb-f04ce5317e4d","Type":"ContainerDied","Data":"00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.095792 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z 
is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.109615 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.109948 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.109959 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.109974 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.109985 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:41Z","lastTransitionTime":"2026-03-20T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.113416 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.126904 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.142701 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.154097 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.173925 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.183285 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.195456 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\
\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"po
dIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.206883 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"19
2.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.214487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.214521 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.214532 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.214547 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.214559 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:41Z","lastTransitionTime":"2026-03-20T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.217578 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.241051 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.254950 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.266938 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.277672 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.297481 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z 
is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.310045 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.316823 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.316873 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.316885 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.316901 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.316910 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:41Z","lastTransitionTime":"2026-03-20T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.325475 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.343760 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.354159 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.366571 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.376075 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.393043 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":
{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:41Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.419602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.419643 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.419652 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.419669 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.419679 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:41Z","lastTransitionTime":"2026-03-20T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.521850 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.521892 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.521904 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.521925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.521936 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:41Z","lastTransitionTime":"2026-03-20T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.624273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.624319 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.624330 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.624345 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.624354 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:41Z","lastTransitionTime":"2026-03-20T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.642222 4772 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.726675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.726742 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.726765 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.726789 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.726807 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:41Z","lastTransitionTime":"2026-03-20T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.829471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.829509 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.829521 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.829537 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.829548 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:41Z","lastTransitionTime":"2026-03-20T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.932330 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.932381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.932397 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.932417 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:41 crc kubenswrapper[4772]: I0320 10:56:41.932432 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:41Z","lastTransitionTime":"2026-03-20T10:56:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.035358 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.035408 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.035421 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.035438 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.035449 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:42Z","lastTransitionTime":"2026-03-20T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.088437 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.088652 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.088667 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.088678 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.088687 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.088696 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.091292 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e02a490-1cd4-40f2-baeb-f04ce5317e4d" containerID="b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da" exitCode=0 Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.091326 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" event={"ID":"8e02a490-1cd4-40f2-baeb-f04ce5317e4d","Type":"ContainerDied","Data":"b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.120941 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller 
ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"}
,{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\
\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.136141 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.139257 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.139304 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.139320 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.139340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.139356 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:42Z","lastTransitionTime":"2026-03-20T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.148783 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.160898 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.175364 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.193320 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.205652 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.218440 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serv
iceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.233459 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.241304 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.241336 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.241346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.241359 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.241368 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:42Z","lastTransitionTime":"2026-03-20T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.245276 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.262258 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:42Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.343624 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.343663 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.343672 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.343687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.343695 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:42Z","lastTransitionTime":"2026-03-20T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.447457 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.447505 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.447521 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.447544 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.447560 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:42Z","lastTransitionTime":"2026-03-20T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.550755 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.550784 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.550793 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.550806 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.550818 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:42Z","lastTransitionTime":"2026-03-20T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.590666 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.590784 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.590807 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.590825 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.590912 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.590956 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:46.590944218 +0000 UTC m=+92.681910703 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.591236 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:56:46.591226716 +0000 UTC m=+92.682193201 (durationBeforeRetry 4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.591303 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.591325 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:46.591319888 +0000 UTC m=+92.682286373 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.591369 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.591380 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.591389 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.591408 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:46.59140293 +0000 UTC m=+92.682369415 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.641850 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.641954 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.642145 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.642271 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.645086 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.645183 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.652376 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.652414 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.652426 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.652441 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.652453 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:42Z","lastTransitionTime":"2026-03-20T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.691674 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.691849 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.691880 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.691892 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:42 crc kubenswrapper[4772]: E0320 10:56:42.691948 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:46.691931662 +0000 UTC m=+92.782898147 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.754627 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.754668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.754679 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.754692 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.754701 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:42Z","lastTransitionTime":"2026-03-20T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.857689 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.857730 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.857743 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.857759 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.857770 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:42Z","lastTransitionTime":"2026-03-20T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.960487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.960536 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.960548 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.960567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:42 crc kubenswrapper[4772]: I0320 10:56:42.960579 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:42Z","lastTransitionTime":"2026-03-20T10:56:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.063663 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.063718 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.063735 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.063756 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.063773 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:43Z","lastTransitionTime":"2026-03-20T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.098747 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e02a490-1cd4-40f2-baeb-f04ce5317e4d" containerID="ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795" exitCode=0 Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.098821 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" event={"ID":"8e02a490-1cd4-40f2-baeb-f04ce5317e4d","Type":"ContainerDied","Data":"ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.101366 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.123740 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.140516 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.153427 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.166614 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.166649 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.166660 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.166675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.166686 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:43Z","lastTransitionTime":"2026-03-20T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.169445 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.189643 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.203739 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.225346 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\
"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"w
aiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.240958 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.255856 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.270080 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.270116 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.270162 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.270180 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.270208 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.270227 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:43Z","lastTransitionTime":"2026-03-20T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.297209 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3
b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.315175 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.333999 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.346999 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.359355 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\"
:\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.369779 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.373298 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.373336 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.373346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.373363 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.373374 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:43Z","lastTransitionTime":"2026-03-20T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.381538 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.391884 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.406300 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.419919 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.429096 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.451057 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:43Z 
is after 2025-08-24T17:21:41Z" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.475817 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.475878 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.475891 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.475907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.475918 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:43Z","lastTransitionTime":"2026-03-20T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.578528 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.578596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.578619 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.578649 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.578676 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:43Z","lastTransitionTime":"2026-03-20T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.657207 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.680514 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.680910 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.680929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.680953 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.680971 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:43Z","lastTransitionTime":"2026-03-20T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.784094 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.784135 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.784146 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.784163 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.784176 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:43Z","lastTransitionTime":"2026-03-20T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.886858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.886890 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.886903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.886917 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.886926 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:43Z","lastTransitionTime":"2026-03-20T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.988905 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.988942 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.988951 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.988965 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:43 crc kubenswrapper[4772]: I0320 10:56:43.988974 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:43Z","lastTransitionTime":"2026-03-20T10:56:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.093039 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.093106 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.093124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.093149 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.093166 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:44Z","lastTransitionTime":"2026-03-20T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.109358 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e02a490-1cd4-40f2-baeb-f04ce5317e4d" containerID="28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9" exitCode=0 Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.109433 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" event={"ID":"8e02a490-1cd4-40f2-baeb-f04ce5317e4d","Type":"ContainerDied","Data":"28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9"} Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.116077 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1"} Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.125010 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.144412 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.162975 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.195664 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",
\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47e
f0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17
b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.196087 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.196161 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.196182 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.196217 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.196242 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:44Z","lastTransitionTime":"2026-03-20T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.216418 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.234898 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.266743 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"la
stState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",
\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.279560 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.295263 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.298611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.298653 4772 kubelet_node_status.go:724] "Recording event 
message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.298669 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.298687 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.298702 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:44Z","lastTransitionTime":"2026-03-20T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.315766 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" 
Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.335567 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recu
rsiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\"
:\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.351338 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.402941 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.402989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.403003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.403021 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.403034 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:44Z","lastTransitionTime":"2026-03-20T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.507764 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.507829 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.507876 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.507900 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.507916 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:44Z","lastTransitionTime":"2026-03-20T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.610290 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.610332 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.610348 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.610369 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.610387 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:44Z","lastTransitionTime":"2026-03-20T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.644135 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:44 crc kubenswrapper[4772]: E0320 10:56:44.644408 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.646305 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.647676 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:44 crc kubenswrapper[4772]: E0320 10:56:44.647677 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:44 crc kubenswrapper[4772]: E0320 10:56:44.647805 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.664716 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z 
is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.684676 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.702749 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.712744 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.712796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.712814 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.712835 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.712884 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:44Z","lastTransitionTime":"2026-03-20T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.716738 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.732213 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.749776 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.772833 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.787499 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.809745 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\
\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.815315 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.815363 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.815380 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.815404 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.815421 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:44Z","lastTransitionTime":"2026-03-20T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.826649 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.841813 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.854135 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.918217 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.918251 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.918262 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.918281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:44 crc kubenswrapper[4772]: I0320 10:56:44.918294 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:44Z","lastTransitionTime":"2026-03-20T10:56:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.021089 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.021127 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.021145 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.021166 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.021182 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.123415 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.123718 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.123743 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.123773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.123792 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.130101 4772 generic.go:334] "Generic (PLEG): container finished" podID="8e02a490-1cd4-40f2-baeb-f04ce5317e4d" containerID="615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17" exitCode=0 Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.130161 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" event={"ID":"8e02a490-1cd4-40f2-baeb-f04ce5317e4d","Type":"ContainerDied","Data":"615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17"} Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.167010 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-a
pi-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"
},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z 
is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.187815 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"
/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\"
:\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.210267 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.224264 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.230216 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.230279 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.230292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.230309 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.230343 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.244123 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.265228 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.281646 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.298275 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.316420 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://284
80097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.331644 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.332308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.332338 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.332346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.332360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.332369 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.345807 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.362075 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.434253 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.434292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.434300 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.434317 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.434326 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.537593 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.537625 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.537635 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.537649 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.537661 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.544349 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-95tl8"] Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.544718 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-95tl8" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.547098 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.547180 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.547189 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.547119 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.569809 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\
":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.586513 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.599048 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.615375 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\"
,\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.629468 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.639714 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.639771 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.639794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.639823 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.639875 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.646913 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.660024 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.682044 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://284
80097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.694933 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.710480 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.725803 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.727427 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvfpf\" (UniqueName: \"kubernetes.io/projected/9de5f9ae-372d-4c5f-89ec-93a96431485b-kube-api-access-bvfpf\") pod \"node-ca-95tl8\" (UID: \"9de5f9ae-372d-4c5f-89ec-93a96431485b\") " pod="openshift-image-registry/node-ca-95tl8" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.727575 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9de5f9ae-372d-4c5f-89ec-93a96431485b-host\") pod \"node-ca-95tl8\" (UID: \"9de5f9ae-372d-4c5f-89ec-93a96431485b\") " pod="openshift-image-registry/node-ca-95tl8" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.727619 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9de5f9ae-372d-4c5f-89ec-93a96431485b-serviceca\") pod \"node-ca-95tl8\" (UID: \"9de5f9ae-372d-4c5f-89ec-93a96431485b\") " pod="openshift-image-registry/node-ca-95tl8" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.740108 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.742593 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.742641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.742654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.742675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.742687 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.768253 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":fa
lse,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\
\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3
b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.828710 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bvfpf\" (UniqueName: \"kubernetes.io/projected/9de5f9ae-372d-4c5f-89ec-93a96431485b-kube-api-access-bvfpf\") pod \"node-ca-95tl8\" (UID: \"9de5f9ae-372d-4c5f-89ec-93a96431485b\") " pod="openshift-image-registry/node-ca-95tl8" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.828782 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9de5f9ae-372d-4c5f-89ec-93a96431485b-host\") pod \"node-ca-95tl8\" (UID: \"9de5f9ae-372d-4c5f-89ec-93a96431485b\") " pod="openshift-image-registry/node-ca-95tl8" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.828820 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9de5f9ae-372d-4c5f-89ec-93a96431485b-serviceca\") pod \"node-ca-95tl8\" (UID: \"9de5f9ae-372d-4c5f-89ec-93a96431485b\") " pod="openshift-image-registry/node-ca-95tl8" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.828945 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9de5f9ae-372d-4c5f-89ec-93a96431485b-host\") pod \"node-ca-95tl8\" (UID: \"9de5f9ae-372d-4c5f-89ec-93a96431485b\") " pod="openshift-image-registry/node-ca-95tl8" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.830593 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/9de5f9ae-372d-4c5f-89ec-93a96431485b-serviceca\") pod \"node-ca-95tl8\" (UID: \"9de5f9ae-372d-4c5f-89ec-93a96431485b\") " pod="openshift-image-registry/node-ca-95tl8" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.832717 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.832772 
4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.832801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.832829 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.832868 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: E0320 10:56:45.850052 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 
2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.853307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.853355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.853372 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.853394 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.853411 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.855070 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvfpf\" (UniqueName: \"kubernetes.io/projected/9de5f9ae-372d-4c5f-89ec-93a96431485b-kube-api-access-bvfpf\") pod \"node-ca-95tl8\" (UID: \"9de5f9ae-372d-4c5f-89ec-93a96431485b\") " pod="openshift-image-registry/node-ca-95tl8" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.864471 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-95tl8" Mar 20 10:56:45 crc kubenswrapper[4772]: E0320 10:56:45.867687 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.872213 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.872246 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.872255 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.872272 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.872283 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: W0320 10:56:45.878127 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9de5f9ae_372d_4c5f_89ec_93a96431485b.slice/crio-9757a9f4694a6262586b004ebbe9865ada9f4586fa1227f67c9d65a51ad802bd WatchSource:0}: Error finding container 9757a9f4694a6262586b004ebbe9865ada9f4586fa1227f67c9d65a51ad802bd: Status 404 returned error can't find the container with id 9757a9f4694a6262586b004ebbe9865ada9f4586fa1227f67c9d65a51ad802bd Mar 20 10:56:45 crc kubenswrapper[4772]: E0320 10:56:45.884265 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeByt
es\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 
2026-03-20T10:56:45Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.887721 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.887771 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.887788 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.887808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.887829 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: E0320 10:56:45.901230 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 
2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.906317 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.906360 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.906371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.906386 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.906397 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:45 crc kubenswrapper[4772]: E0320 10:56:45.919025 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:45Z is after 
2025-08-24T17:21:41Z" Mar 20 10:56:45 crc kubenswrapper[4772]: E0320 10:56:45.919189 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.921537 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.921567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.921580 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.921609 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:45 crc kubenswrapper[4772]: I0320 10:56:45.921624 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:45Z","lastTransitionTime":"2026-03-20T10:56:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.024213 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.024245 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.024255 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.024272 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.024308 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.126110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.126455 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.126465 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.126482 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.126492 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.142588 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"84224e4f2f3cbc8715c5d71bf88b4e58c8b7c64830d85a6e80b8fafbfa55be07"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.142829 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.142883 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.142904 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.147776 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" event={"ID":"8e02a490-1cd4-40f2-baeb-f04ce5317e4d","Type":"ContainerStarted","Data":"c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.149235 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-95tl8" event={"ID":"9de5f9ae-372d-4c5f-89ec-93a96431485b","Type":"ContainerStarted","Data":"8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.149271 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-95tl8" event={"ID":"9de5f9ae-372d-4c5f-89ec-93a96431485b","Type":"ContainerStarted","Data":"9757a9f4694a6262586b004ebbe9865ada9f4586fa1227f67c9d65a51ad802bd"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.160570 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.199750 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.202298 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.205294 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.209876 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.224242 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://284
80097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.228800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.228828 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.228858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.228875 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.228885 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.238540 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.249486 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.260086 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.276526 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.294639 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84224e4f2f3cbc8715c5d71bf88b4e58c8b7c648
30d85a6e80b8fafbfa55be07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccou
nt\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.317952 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.332105 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.332172 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.332191 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.332218 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.332237 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.334989 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.349934 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.369345 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.390672 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveRea
dOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92d
f256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84224e4f2f3cbc8715c5d71bf88b4e58c8b7c64830d85a6e80b8fafbfa55be07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\
\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.407399 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.420974 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.435922 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.435971 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.435988 4772 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.436010 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.436028 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.439791 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"
},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.464554 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\
\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":tr
ue,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.478462 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.492965 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.505623 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.523129 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.538553 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.538598 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc 
kubenswrapper[4772]: I0320 10:56:46.538611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.538631 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.538643 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.541467 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\
\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.553765 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.566547 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.580455 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:46Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.635711 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.635863 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:56:54.635831023 +0000 UTC m=+100.726797498 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.635980 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.636038 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.636091 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.636108 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.636140 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:54.636133651 +0000 UTC m=+100.727100136 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.636248 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.636273 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.636308 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.636331 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.636341 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:54.636318856 +0000 UTC m=+100.727285351 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.636396 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:54.636375028 +0000 UTC m=+100.727341583 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.641800 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.641901 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.641829 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.642039 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.642179 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.642328 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.644464 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.644494 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.644508 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.644524 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.644537 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.737162 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.737322 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.737345 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.737356 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:46 crc kubenswrapper[4772]: E0320 10:56:46.737407 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:54.737392933 +0000 UTC m=+100.828359418 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.746860 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.746903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.746913 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.746928 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.746939 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.849269 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.849321 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.849333 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.849353 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.849398 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.952798 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.952914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.952943 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.952971 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:46 crc kubenswrapper[4772]: I0320 10:56:46.953027 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:46Z","lastTransitionTime":"2026-03-20T10:56:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.056012 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.056060 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.056072 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.056090 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.056103 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.158154 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.158188 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.158228 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.158242 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.158251 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.261089 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.261140 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.261157 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.261180 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.261199 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.363474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.363524 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.363539 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.363558 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.363572 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.477064 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.477106 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.477118 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.477134 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.477149 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.579198 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.579241 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.579252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.579269 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.579280 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.681916 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.681970 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.681989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.682013 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.682030 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.785171 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.785242 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.785263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.785294 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.785316 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.889473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.889582 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.889604 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.889631 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.889650 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.992261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.992319 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.992332 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.992350 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:47 crc kubenswrapper[4772]: I0320 10:56:47.992362 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:47Z","lastTransitionTime":"2026-03-20T10:56:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.096006 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.096057 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.096079 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.096109 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.096132 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:48Z","lastTransitionTime":"2026-03-20T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.199148 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.199193 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.199204 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.199219 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.199232 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:48Z","lastTransitionTime":"2026-03-20T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.301467 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.301508 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.301518 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.301534 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.301544 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:48Z","lastTransitionTime":"2026-03-20T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.404228 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.404273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.404284 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.404305 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.404319 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:48Z","lastTransitionTime":"2026-03-20T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.506796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.507111 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.507178 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.507241 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.507306 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:48Z","lastTransitionTime":"2026-03-20T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.609977 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.610014 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.610022 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.610036 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.610045 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:48Z","lastTransitionTime":"2026-03-20T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.641709 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.641749 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:48 crc kubenswrapper[4772]: E0320 10:56:48.641819 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:48 crc kubenswrapper[4772]: E0320 10:56:48.641941 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.641710 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:48 crc kubenswrapper[4772]: E0320 10:56:48.642016 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.712708 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.712732 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.712742 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.712755 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.712765 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:48Z","lastTransitionTime":"2026-03-20T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.815143 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.815184 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.815195 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.815213 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.815225 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:48Z","lastTransitionTime":"2026-03-20T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.919574 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.919607 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.919615 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.919630 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:48 crc kubenswrapper[4772]: I0320 10:56:48.919644 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:48Z","lastTransitionTime":"2026-03-20T10:56:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.023184 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.023222 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.023230 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.023246 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.023256 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:49Z","lastTransitionTime":"2026-03-20T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.126570 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.126661 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.126690 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.126722 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.126748 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:49Z","lastTransitionTime":"2026-03-20T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.161394 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/0.log" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.165774 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="84224e4f2f3cbc8715c5d71bf88b4e58c8b7c64830d85a6e80b8fafbfa55be07" exitCode=1 Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.165885 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"84224e4f2f3cbc8715c5d71bf88b4e58c8b7c64830d85a6e80b8fafbfa55be07"} Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.167150 4772 scope.go:117] "RemoveContainer" containerID="84224e4f2f3cbc8715c5d71bf88b4e58c8b7c64830d85a6e80b8fafbfa55be07" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.186413 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.202440 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.219973 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.229597 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.229640 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:49 crc 
kubenswrapper[4772]: I0320 10:56:49.229658 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.229673 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.229685 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:49Z","lastTransitionTime":"2026-03-20T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.241311 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.255423 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.274057 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.292796 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.310138 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.333425 4772 kubelet_node_status.go:724] 
"Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.333474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.333487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.333505 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.333517 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:49Z","lastTransitionTime":"2026-03-20T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.344735 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://84224e4f2f3cbc8715c5d71bf88b4e58c8b7c648
30d85a6e80b8fafbfa55be07\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84224e4f2f3cbc8715c5d71bf88b4e58c8b7c64830d85a6e80b8fafbfa55be07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:48Z\\\",\\\"message\\\":\\\"lient-go/informers/factory.go:160\\\\nI0320 10:56:48.857442 6539 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:56:48.857508 6539 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:56:48.857665 6539 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:56:48.858187 6539 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:48.858165 6539 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:48.858226 6539 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:48.858250 6539 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:48.858302 6539 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:48.858403 6539 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:48.858412 6539 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:48.858423 6539 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:48.858455 6539 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:48.858489 6539 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099
482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.367906 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/
openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.382296 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",
\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.401925 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.415719 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:49Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.436221 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.436260 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.436270 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.436285 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.436297 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:49Z","lastTransitionTime":"2026-03-20T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.539016 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.539096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.539115 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.539549 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.539608 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:49Z","lastTransitionTime":"2026-03-20T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.642612 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.642649 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.642659 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.642706 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.642719 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:49Z","lastTransitionTime":"2026-03-20T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.744789 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.744822 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.744830 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.744855 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.744864 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:49Z","lastTransitionTime":"2026-03-20T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.847115 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.847161 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.847174 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.847191 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.847200 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:49Z","lastTransitionTime":"2026-03-20T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.949542 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.949591 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.949602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.949620 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:49 crc kubenswrapper[4772]: I0320 10:56:49.949633 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:49Z","lastTransitionTime":"2026-03-20T10:56:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.051452 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.051528 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.051549 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.051577 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.051597 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:50Z","lastTransitionTime":"2026-03-20T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.153957 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.153999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.154009 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.154024 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.154035 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:50Z","lastTransitionTime":"2026-03-20T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.170505 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/0.log" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.173330 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2"} Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.173824 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.205189 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b
040d72fbf30c35be8c7a9dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84224e4f2f3cbc8715c5d71bf88b4e58c8b7c64830d85a6e80b8fafbfa55be07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:48Z\\\",\\\"message\\\":\\\"lient-go/informers/factory.go:160\\\\nI0320 10:56:48.857442 6539 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:56:48.857508 6539 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:56:48.857665 6539 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:56:48.858187 6539 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:48.858165 6539 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:48.858226 6539 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:48.858250 6539 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:48.858302 6539 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:48.858403 6539 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:48.858412 6539 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:48.858423 6539 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:48.858455 6539 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:48.858489 6539 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"conta
inerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.233557 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.256879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.256918 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.256927 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.256942 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.256955 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:50Z","lastTransitionTime":"2026-03-20T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.257565 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.295772 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.308573 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.320866 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.333206 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.344774 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.358579 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.359114 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.359172 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:50 crc 
kubenswrapper[4772]: I0320 10:56:50.359182 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.359202 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.359213 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:50Z","lastTransitionTime":"2026-03-20T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.369974 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.383152 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.393966 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.406350 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:50Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.461500 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.461543 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.461552 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.461567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.461582 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:50Z","lastTransitionTime":"2026-03-20T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.563590 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.563630 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.563641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.563658 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.563669 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:50Z","lastTransitionTime":"2026-03-20T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.641589 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:50 crc kubenswrapper[4772]: E0320 10:56:50.641740 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.641989 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.642246 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:50 crc kubenswrapper[4772]: E0320 10:56:50.642407 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:50 crc kubenswrapper[4772]: E0320 10:56:50.642252 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.666510 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.666800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.666959 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.667054 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.667158 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:50Z","lastTransitionTime":"2026-03-20T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.770335 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.770377 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.770389 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.770406 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.770416 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:50Z","lastTransitionTime":"2026-03-20T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.873066 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.873121 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.873140 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.873162 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.873179 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:50Z","lastTransitionTime":"2026-03-20T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.975612 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.975645 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.975654 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.975667 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:50 crc kubenswrapper[4772]: I0320 10:56:50.975676 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:50Z","lastTransitionTime":"2026-03-20T10:56:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.078766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.078817 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.078826 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.078858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.078871 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:51Z","lastTransitionTime":"2026-03-20T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.181533 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/1.log" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.182173 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.182199 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.182207 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.182221 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.182230 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:51Z","lastTransitionTime":"2026-03-20T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.182797 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/0.log" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.188670 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2" exitCode=1 Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.188703 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2"} Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.188731 4772 scope.go:117] "RemoveContainer" containerID="84224e4f2f3cbc8715c5d71bf88b4e58c8b7c64830d85a6e80b8fafbfa55be07" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.191642 4772 scope.go:117] "RemoveContainer" containerID="b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2" Mar 20 10:56:51 crc kubenswrapper[4772]: E0320 10:56:51.192003 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.208555 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.227620 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.241398 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.252065 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.271497 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/me
trics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84224e4f2f3cbc8715c5d71bf88b4e58c8b7c64830d85a6e80b8fafbfa55be07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:48Z\\\",\\\"message\\\":\\\"lient-go/informers/factory.go:160\\\\nI0320 10:56:48.857442 6539 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:56:48.857508 6539 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:56:48.857665 6539 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:56:48.858187 6539 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:48.858165 6539 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:48.858226 6539 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:48.858250 6539 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:48.858302 6539 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:48.858403 6539 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:48.858412 6539 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:48.858423 6539 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:48.858455 6539 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:48.858489 6539 factory.go:656] Stopping 
\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:50Z\\\",\\\"message\\\":\\\"h-apiserver/api\\\\\\\"}\\\\nI0320 10:56:50.161482 6686 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.281336ms\\\\nI0320 10:56:50.161559 6686 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161606 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:50.161641 6686 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:50.161665 6686 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:50.161692 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:50.161695 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161703 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:50.161719 6686 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:50.161739 6686 factory.go:656] Stopping watch factory\\\\nI0320 10:56:50.161743 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:50.161764 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:50.161829 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:50.161877 6686 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:50.161957 6686 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd4
7ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.284436 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.284473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.284484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.284522 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.284534 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:51Z","lastTransitionTime":"2026-03-20T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.293930 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\
":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finished
At\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.307617 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.320991 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.335022 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.350251 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.363779 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.375709 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.388604 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.388630 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.388641 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.388656 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.388666 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:51Z","lastTransitionTime":"2026-03-20T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.392278 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773
152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.491243 4772 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.491283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.491292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.491308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.491316 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:51Z","lastTransitionTime":"2026-03-20T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.528481 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb"] Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.529220 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.532258 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.532463 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.561646 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.576672 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.592148 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.593435 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.593475 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.593489 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.593507 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.593519 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:51Z","lastTransitionTime":"2026-03-20T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.604186 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.620352 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.620970 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x4qpq\" (UniqueName: \"kubernetes.io/projected/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-kube-api-access-x4qpq\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.621028 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.621069 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.621103 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.635287 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.645871 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.651317 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.651669 4772 scope.go:117] "RemoveContainer" containerID="a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5" Mar 20 10:56:51 crc kubenswrapper[4772]: E0320 10:56:51.651871 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.667524 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.679798 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.693972 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.695355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.695478 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.695582 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.695706 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.695823 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:51Z","lastTransitionTime":"2026-03-20T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.711151 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.721362 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.721734 4772 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-x4qpq\" (UniqueName: \"kubernetes.io/projected/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-kube-api-access-x4qpq\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.721791 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.721880 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.721937 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.722918 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.722995 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-env-overrides\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.727112 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.733146 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.739174 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x4qpq\" (UniqueName: \"kubernetes.io/projected/9742872d-abdb-4fdc-a4d2-48d04fa61dbf-kube-api-access-x4qpq\") pod \"ovnkube-control-plane-749d76644c-kzxjb\" (UID: \"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.750160 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://84224e4f2f3cbc8715c5d71bf88b4e58c8b7c64830d85a6e80b8fafbfa55be07\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:48Z\\\",\\\"message\\\":\\\"lient-go/informers/factory.go:160\\\\nI0320 10:56:48.857442 6539 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:56:48.857508 6539 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:56:48.857665 6539 reflector.go:311] Stopping reflector *v1.EndpointSlice (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:56:48.858187 6539 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:56:48.858165 6539 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:48.858226 6539 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:48.858250 6539 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:48.858302 6539 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:56:48.858403 6539 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10:56:48.858412 6539 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:48.858423 6539 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:48.858455 6539 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:48.858489 6539 factory.go:656] Stopping \\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:50Z\\\",\\\"message\\\":\\\"h-apiserver/api\\\\\\\"}\\\\nI0320 10:56:50.161482 6686 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 
1.281336ms\\\\nI0320 10:56:50.161559 6686 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161606 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:50.161641 6686 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:50.161665 6686 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:50.161692 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:50.161695 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161703 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:50.161719 6686 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:50.161739 6686 factory.go:656] Stopping watch factory\\\\nI0320 10:56:50.161743 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:50.161764 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:50.161829 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:50.161877 6686 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:50.161957 6686 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:49Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac
2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:51Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.798712 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.798748 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.798756 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.798776 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.798786 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:51Z","lastTransitionTime":"2026-03-20T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.849632 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" Mar 20 10:56:51 crc kubenswrapper[4772]: W0320 10:56:51.863888 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9742872d_abdb_4fdc_a4d2_48d04fa61dbf.slice/crio-2659d5fb9c41f30a93a559778bd6f10d111bdd0d5a80b82a19722876c4637c82 WatchSource:0}: Error finding container 2659d5fb9c41f30a93a559778bd6f10d111bdd0d5a80b82a19722876c4637c82: Status 404 returned error can't find the container with id 2659d5fb9c41f30a93a559778bd6f10d111bdd0d5a80b82a19722876c4637c82 Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.902124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.902527 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.902575 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.902606 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:51 crc kubenswrapper[4772]: I0320 10:56:51.902630 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:51Z","lastTransitionTime":"2026-03-20T10:56:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.006036 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.006073 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.006082 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.006096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.006105 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:52Z","lastTransitionTime":"2026-03-20T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.108524 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.108563 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.108574 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.108589 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.108599 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:52Z","lastTransitionTime":"2026-03-20T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.194983 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/1.log" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.200336 4772 scope.go:117] "RemoveContainer" containerID="b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2" Mar 20 10:56:52 crc kubenswrapper[4772]: E0320 10:56:52.200702 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.206252 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" event={"ID":"9742872d-abdb-4fdc-a4d2-48d04fa61dbf","Type":"ContainerStarted","Data":"a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.206317 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" event={"ID":"9742872d-abdb-4fdc-a4d2-48d04fa61dbf","Type":"ContainerStarted","Data":"2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.206343 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" event={"ID":"9742872d-abdb-4fdc-a4d2-48d04fa61dbf","Type":"ContainerStarted","Data":"2659d5fb9c41f30a93a559778bd6f10d111bdd0d5a80b82a19722876c4637c82"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.210383 4772 scope.go:117] "RemoveContainer" containerID="a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5" Mar 20 10:56:52 crc kubenswrapper[4772]: E0320 10:56:52.210540 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-apiserver-check-endpoints\" with CrashLoopBackOff: \"back-off 40s restarting failed 
container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\"" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.211525 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.211560 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.211569 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.211581 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.211592 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:52Z","lastTransitionTime":"2026-03-20T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.222030 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.241765 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:50Z\\\",\\\"message\\\":\\\"h-apiserver/api\\\\\\\"}\\\\nI0320 10:56:50.161482 6686 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.281336ms\\\\nI0320 10:56:50.161559 6686 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161606 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:50.161641 6686 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:50.161665 6686 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:50.161692 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:50.161695 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161703 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:50.161719 6686 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:50.161739 6686 factory.go:656] Stopping watch factory\\\\nI0320 10:56:50.161743 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:50.161764 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:50.161829 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:50.161877 6686 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:50.161957 6686 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.258890 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.280711 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.280918 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-m8kjd"] Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.281419 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:52 crc kubenswrapper[4772]: E0320 10:56:52.281470 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.293778 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.306421 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.313131 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.313156 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.313164 4772 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientPID" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.313176 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.313185 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:52Z","lastTransitionTime":"2026-03-20T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.320992 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.327632 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" 
(UniqueName: \"kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.327695 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7srzh\" (UniqueName: \"kubernetes.io/projected/2ac5550b-02eb-48b4-b62a-e21dd4429249-kube-api-access-7srzh\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.340617 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03
-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshi
ft-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"po
dIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.350905 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.362472 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.373345 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": 
failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.382724 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.392368 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.403760 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.412960 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.415635 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.415678 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.415692 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.415710 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.415723 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:52Z","lastTransitionTime":"2026-03-20T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.428108 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apis
erver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.428757 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7srzh\" (UniqueName: \"kubernetes.io/projected/2ac5550b-02eb-48b4-b62a-e21dd4429249-kube-api-access-7srzh\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.428962 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs\") pod 
\"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:52 crc kubenswrapper[4772]: E0320 10:56:52.429056 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:52 crc kubenswrapper[4772]: E0320 10:56:52.429112 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs podName:2ac5550b-02eb-48b4-b62a-e21dd4429249 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:52.929097497 +0000 UTC m=+99.020063982 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs") pod "network-metrics-daemon-m8kjd" (UID: "2ac5550b-02eb-48b4-b62a-e21dd4429249") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.447533 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b
040d72fbf30c35be8c7a9dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:50Z\\\",\\\"message\\\":\\\"h-apiserver/api\\\\\\\"}\\\\nI0320 10:56:50.161482 6686 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.281336ms\\\\nI0320 10:56:50.161559 6686 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161606 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:50.161641 6686 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:50.161665 6686 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:50.161692 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:50.161695 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161703 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:50.161719 6686 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:50.161739 6686 factory.go:656] Stopping watch factory\\\\nI0320 10:56:50.161743 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:50.161764 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:50.161829 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:50.161877 6686 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:50.161957 6686 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.450100 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7srzh\" (UniqueName: \"kubernetes.io/projected/2ac5550b-02eb-48b4-b62a-e21dd4429249-kube-api-access-7srzh\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.467357 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.478567 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.490185 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.502041 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.519007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.519054 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.519069 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.519089 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.519104 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:52Z","lastTransitionTime":"2026-03-20T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.519255 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.531987 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.545367 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.554066 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.567598 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.577588 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 
10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.586092 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.596303 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.604867 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.611992 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:52Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.621658 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.621695 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.621703 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.621742 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.621753 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:52Z","lastTransitionTime":"2026-03-20T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.641675 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.641721 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:52 crc kubenswrapper[4772]: E0320 10:56:52.641770 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.641808 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:52 crc kubenswrapper[4772]: E0320 10:56:52.641944 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:52 crc kubenswrapper[4772]: E0320 10:56:52.642050 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.723561 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.723593 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.723602 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.723614 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.723622 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:52Z","lastTransitionTime":"2026-03-20T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.825566 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.825605 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.825616 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.825632 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.825643 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:52Z","lastTransitionTime":"2026-03-20T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.928011 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.928072 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.928084 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.928123 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.928138 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:52Z","lastTransitionTime":"2026-03-20T10:56:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:52 crc kubenswrapper[4772]: I0320 10:56:52.933459 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:52 crc kubenswrapper[4772]: E0320 10:56:52.933657 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:52 crc kubenswrapper[4772]: E0320 10:56:52.933769 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs podName:2ac5550b-02eb-48b4-b62a-e21dd4429249 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:53.933728819 +0000 UTC m=+100.024695324 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs") pod "network-metrics-daemon-m8kjd" (UID: "2ac5550b-02eb-48b4-b62a-e21dd4429249") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.030269 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.030315 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.030350 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.030368 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.030380 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:53Z","lastTransitionTime":"2026-03-20T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.133097 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.133167 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.133179 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.133217 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.133230 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:53Z","lastTransitionTime":"2026-03-20T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.236260 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.236997 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.237025 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.237057 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.237078 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:53Z","lastTransitionTime":"2026-03-20T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.340685 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.340757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.340771 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.340794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.340808 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:53Z","lastTransitionTime":"2026-03-20T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.443719 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.444371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.444399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.444424 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.444440 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:53Z","lastTransitionTime":"2026-03-20T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.548286 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.548332 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.548347 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.548369 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.548385 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:53Z","lastTransitionTime":"2026-03-20T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.650812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.650888 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.650901 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.650917 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.650930 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:53Z","lastTransitionTime":"2026-03-20T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.753268 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.753317 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.753367 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.753386 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.753399 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:53Z","lastTransitionTime":"2026-03-20T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.856134 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.856178 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.856194 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.856215 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.856241 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:53Z","lastTransitionTime":"2026-03-20T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.944545 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:53 crc kubenswrapper[4772]: E0320 10:56:53.944767 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:53 crc kubenswrapper[4772]: E0320 10:56:53.944885 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs podName:2ac5550b-02eb-48b4-b62a-e21dd4429249 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:55.944817915 +0000 UTC m=+102.035784410 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs") pod "network-metrics-daemon-m8kjd" (UID: "2ac5550b-02eb-48b4-b62a-e21dd4429249") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.958812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.958904 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.958923 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.958955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:53 crc kubenswrapper[4772]: I0320 10:56:53.958973 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:53Z","lastTransitionTime":"2026-03-20T10:56:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.061029 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.061080 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.061091 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.061111 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.061125 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:54Z","lastTransitionTime":"2026-03-20T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.164167 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.164222 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.164241 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.164264 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.164284 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:54Z","lastTransitionTime":"2026-03-20T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.266756 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.266800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.266830 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.266867 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.266879 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:54Z","lastTransitionTime":"2026-03-20T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.368879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.368925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.368936 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.368951 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.368964 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:54Z","lastTransitionTime":"2026-03-20T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.475980 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.476028 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.476040 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.476059 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.476072 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:54Z","lastTransitionTime":"2026-03-20T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.579565 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.579939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.580130 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.580273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.580425 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:54Z","lastTransitionTime":"2026-03-20T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.641803 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.641963 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.642233 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.642320 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.642509 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.642601 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.642818 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.642974 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.653458 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.653605 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.653653 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.653711 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.653682167 +0000 UTC m=+116.744648682 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.653758 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.653827 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.65380506 +0000 UTC m=+116.744771585 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.653926 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.653950 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.653973 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.653757 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.654020 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.654006486 +0000 UTC m=+116.744973011 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.653955 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.654080 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.654068478 +0000 UTC m=+116.745035003 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.658073 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.673624 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.683468 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.683536 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.683558 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.683582 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.683602 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:54Z","lastTransitionTime":"2026-03-20T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.685415 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126
.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.699170 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.730955 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"re
startCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state
\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.746699 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.754702 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.755053 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.755110 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.755132 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:54 crc kubenswrapper[4772]: E0320 10:56:54.755222 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:10.755196566 +0000 UTC m=+116.846163091 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.763300 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.775990 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.785578 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.785635 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.785664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.785696 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.785714 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:54Z","lastTransitionTime":"2026-03-20T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.796284 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773
152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.809281 4772 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.822134 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.833886 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.847135 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.858746 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.870308 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling 
back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints 
pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.885018 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name
\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\
"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:50Z\\\",\\\"message\\\":\\\"h-apiserver/api\\\\\\\"}\\\\nI0320 10:56:50.161482 6686 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.281336ms\\\\nI0320 10:56:50.161559 6686 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161606 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:50.161641 6686 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:50.161665 6686 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:50.161692 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:50.161695 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161703 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:50.161719 6686 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:50.161739 6686 factory.go:656] Stopping watch factory\\\\nI0320 10:56:50.161743 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:50.161764 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:50.161829 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:50.161877 6686 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:50.161957 6686 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:54Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.887015 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.887037 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.887049 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.887065 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.887077 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:54Z","lastTransitionTime":"2026-03-20T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.989318 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.989381 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.989402 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.989433 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:54 crc kubenswrapper[4772]: I0320 10:56:54.989455 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:54Z","lastTransitionTime":"2026-03-20T10:56:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.092497 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.092881 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.093040 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.093427 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.093578 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:55Z","lastTransitionTime":"2026-03-20T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.197224 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.197273 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.197288 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.197308 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.197324 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:55Z","lastTransitionTime":"2026-03-20T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.300154 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.300206 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.300221 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.300238 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.300250 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:55Z","lastTransitionTime":"2026-03-20T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.402621 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.402659 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.402668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.402681 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.402690 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:55Z","lastTransitionTime":"2026-03-20T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.506136 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.506195 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.506214 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.506238 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.506255 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:55Z","lastTransitionTime":"2026-03-20T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.608192 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.608235 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.608247 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.608263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.608281 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:55Z","lastTransitionTime":"2026-03-20T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.711313 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.711350 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.711361 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.711375 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.711386 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:55Z","lastTransitionTime":"2026-03-20T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.813406 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.813472 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.813496 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.813524 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.813548 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:55Z","lastTransitionTime":"2026-03-20T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.915782 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.915817 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.915826 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.915864 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.915874 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:55Z","lastTransitionTime":"2026-03-20T10:56:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:55 crc kubenswrapper[4772]: I0320 10:56:55.966189 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:55 crc kubenswrapper[4772]: E0320 10:56:55.966324 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:55 crc kubenswrapper[4772]: E0320 10:56:55.966401 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs podName:2ac5550b-02eb-48b4-b62a-e21dd4429249 nodeName:}" failed. No retries permitted until 2026-03-20 10:56:59.966378918 +0000 UTC m=+106.057345423 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs") pod "network-metrics-daemon-m8kjd" (UID: "2ac5550b-02eb-48b4-b62a-e21dd4429249") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.018395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.018439 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.018453 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.018471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.018485 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.120639 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.120681 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.120692 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.120708 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.120720 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.222581 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.222617 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.222630 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.222647 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.222660 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.252517 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.252572 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.252586 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.252609 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.252623 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: E0320 10:56:56.266131 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:56Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.269557 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.269598 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.269611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.269628 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.269639 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: E0320 10:56:56.282889 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:56Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.286651 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.286693 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.286707 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.286724 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.286737 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: E0320 10:56:56.300429 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:56Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.304474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.304524 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.304540 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.304561 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.304576 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: E0320 10:56:56.320210 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:56Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.323800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.323834 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.323866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.323883 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.323894 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: E0320 10:56:56.337168 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:56Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:56:56Z is after 2025-08-24T17:21:41Z" Mar 20 10:56:56 crc kubenswrapper[4772]: E0320 10:56:56.337324 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.338761 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.338802 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.338812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.338829 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.338865 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.442021 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.442069 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.442080 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.442096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.442108 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.543899 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.544170 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.544252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.544378 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.544463 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.641240 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.641259 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.641262 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.641322 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:56 crc kubenswrapper[4772]: E0320 10:56:56.641449 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:56 crc kubenswrapper[4772]: E0320 10:56:56.641622 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:56:56 crc kubenswrapper[4772]: E0320 10:56:56.641730 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:56 crc kubenswrapper[4772]: E0320 10:56:56.641815 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.646278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.646346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.646377 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.646398 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.646411 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.749278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.749325 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.749341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.749361 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.749373 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.851812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.851856 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.851866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.851879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.851889 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.954570 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.954623 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.954640 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.954662 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:56 crc kubenswrapper[4772]: I0320 10:56:56.954678 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:56Z","lastTransitionTime":"2026-03-20T10:56:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.056307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.056362 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.056374 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.056392 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.056403 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.158613 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.158882 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.158986 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.159097 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.159211 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.262088 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.262317 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.262430 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.262559 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.262675 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.365395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.365470 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.365493 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.365525 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.365551 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.468317 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.468369 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.468385 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.468406 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.468421 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.570967 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.571346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.571523 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.571672 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.571796 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.675141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.675187 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.675205 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.675225 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.675241 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.778473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.778531 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.778547 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.778566 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.778581 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.882179 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.882234 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.882252 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.882278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.882296 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.985685 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.985811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.985833 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.985919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:57 crc kubenswrapper[4772]: I0320 10:56:57.985941 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:57Z","lastTransitionTime":"2026-03-20T10:56:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.088120 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.088398 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.088533 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.088666 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.088800 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:58Z","lastTransitionTime":"2026-03-20T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.192062 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.192471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.192624 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.192781 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.192959 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:58Z","lastTransitionTime":"2026-03-20T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.295805 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.295898 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.295916 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.295939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.295956 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:58Z","lastTransitionTime":"2026-03-20T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.398665 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.398779 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.398822 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.398925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.398956 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:58Z","lastTransitionTime":"2026-03-20T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.503311 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.503354 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.503365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.503385 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.503398 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:58Z","lastTransitionTime":"2026-03-20T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.606473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.606543 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.606564 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.606595 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.606617 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:58Z","lastTransitionTime":"2026-03-20T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.641316 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.641454 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:56:58 crc kubenswrapper[4772]: E0320 10:56:58.641524 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.641637 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:56:58 crc kubenswrapper[4772]: E0320 10:56:58.641727 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.641692 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:56:58 crc kubenswrapper[4772]: E0320 10:56:58.642008 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:56:58 crc kubenswrapper[4772]: E0320 10:56:58.642389 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.711516 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.711596 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.711611 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.711634 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.711670 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:58Z","lastTransitionTime":"2026-03-20T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.814936 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.814986 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.815005 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.815031 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.815050 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:58Z","lastTransitionTime":"2026-03-20T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.918724 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.918804 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.918831 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.918897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:58 crc kubenswrapper[4772]: I0320 10:56:58.918923 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:58Z","lastTransitionTime":"2026-03-20T10:56:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.022998 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.023051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.023064 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.023085 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.023098 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:59Z","lastTransitionTime":"2026-03-20T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.125778 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.125893 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.125917 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.125952 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.125979 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:59Z","lastTransitionTime":"2026-03-20T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.229202 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.229263 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.229284 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.229312 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.229331 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:59Z","lastTransitionTime":"2026-03-20T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.333342 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.333421 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.333440 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.333470 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.333491 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:59Z","lastTransitionTime":"2026-03-20T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.437144 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.437209 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.437227 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.437257 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.437279 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:59Z","lastTransitionTime":"2026-03-20T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.539996 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.540057 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.540075 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.540100 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.540119 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:59Z","lastTransitionTime":"2026-03-20T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.644373 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.644478 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.644504 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.644536 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.644559 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:59Z","lastTransitionTime":"2026-03-20T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.748911 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.748997 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.749019 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.749049 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.749071 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:59Z","lastTransitionTime":"2026-03-20T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.852958 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.853471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.853903 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.854123 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.854335 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:59Z","lastTransitionTime":"2026-03-20T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.957801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.958287 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.958417 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.958541 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:56:59 crc kubenswrapper[4772]: I0320 10:56:59.958731 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:56:59Z","lastTransitionTime":"2026-03-20T10:56:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.010935 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:00 crc kubenswrapper[4772]: E0320 10:57:00.011201 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:00 crc kubenswrapper[4772]: E0320 10:57:00.011339 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs podName:2ac5550b-02eb-48b4-b62a-e21dd4429249 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:08.011308345 +0000 UTC m=+114.102274910 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs") pod "network-metrics-daemon-m8kjd" (UID: "2ac5550b-02eb-48b4-b62a-e21dd4429249") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.063679 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.063791 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.063819 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.063907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.063937 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:00Z","lastTransitionTime":"2026-03-20T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.168089 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.168168 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.168189 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.168231 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.168255 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:00Z","lastTransitionTime":"2026-03-20T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.271467 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.271872 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.272040 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.272268 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.272440 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:00Z","lastTransitionTime":"2026-03-20T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.375567 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.375747 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.375779 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.375811 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.375910 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:00Z","lastTransitionTime":"2026-03-20T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.478929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.479000 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.479018 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.479042 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.479061 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:00Z","lastTransitionTime":"2026-03-20T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.582817 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.583321 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.583521 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.583722 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.583896 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:00Z","lastTransitionTime":"2026-03-20T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.642078 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.642156 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.642102 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.642110 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:00 crc kubenswrapper[4772]: E0320 10:57:00.642352 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:00 crc kubenswrapper[4772]: E0320 10:57:00.642541 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:00 crc kubenswrapper[4772]: E0320 10:57:00.642724 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:00 crc kubenswrapper[4772]: E0320 10:57:00.642815 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.687722 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.687770 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.687783 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.687805 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.687819 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:00Z","lastTransitionTime":"2026-03-20T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.790879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.790938 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.790960 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.790984 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.791000 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:00Z","lastTransitionTime":"2026-03-20T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.894625 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.894679 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.894698 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.894725 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.894772 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:00Z","lastTransitionTime":"2026-03-20T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.998144 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.998219 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.998240 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.998271 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:00 crc kubenswrapper[4772]: I0320 10:57:00.998290 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:00Z","lastTransitionTime":"2026-03-20T10:57:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.101791 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.101914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.101937 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.101965 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.101984 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:01Z","lastTransitionTime":"2026-03-20T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.204920 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.205026 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.205047 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.205079 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.205105 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:01Z","lastTransitionTime":"2026-03-20T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.308068 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.308124 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.308136 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.308158 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.308172 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:01Z","lastTransitionTime":"2026-03-20T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.412318 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.412426 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.412451 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.412479 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.412498 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:01Z","lastTransitionTime":"2026-03-20T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.516294 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.516371 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.516395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.516422 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.516445 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:01Z","lastTransitionTime":"2026-03-20T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.620534 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.620636 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.620652 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.620671 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.620685 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:01Z","lastTransitionTime":"2026-03-20T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.724346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.724441 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.724459 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.724481 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.724498 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:01Z","lastTransitionTime":"2026-03-20T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.828599 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.829129 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.829281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.829426 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.829551 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:01Z","lastTransitionTime":"2026-03-20T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.933679 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.933725 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.933737 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.933754 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:01 crc kubenswrapper[4772]: I0320 10:57:01.933767 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:01Z","lastTransitionTime":"2026-03-20T10:57:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.036679 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.036747 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.036768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.036801 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.036823 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:02Z","lastTransitionTime":"2026-03-20T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.140576 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.140640 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.140657 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.140681 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.140698 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:02Z","lastTransitionTime":"2026-03-20T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.244573 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.244653 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.244676 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.244706 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.244727 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:02Z","lastTransitionTime":"2026-03-20T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.348563 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.348626 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.348646 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.348675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.348694 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:02Z","lastTransitionTime":"2026-03-20T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.452484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.452575 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.452601 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.452637 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.452664 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:02Z","lastTransitionTime":"2026-03-20T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.556016 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.556096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.556116 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.556146 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.556168 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:02Z","lastTransitionTime":"2026-03-20T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.641786 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.641938 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:02 crc kubenswrapper[4772]: E0320 10:57:02.642001 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.642038 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.642081 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:02 crc kubenswrapper[4772]: E0320 10:57:02.642204 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:02 crc kubenswrapper[4772]: E0320 10:57:02.642295 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:02 crc kubenswrapper[4772]: E0320 10:57:02.642364 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.658808 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.658969 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.659008 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.659043 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.659142 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:02Z","lastTransitionTime":"2026-03-20T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.761939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.762018 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.762032 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.762053 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.762069 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:02Z","lastTransitionTime":"2026-03-20T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.868943 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.869012 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.869034 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.869060 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.869081 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:02Z","lastTransitionTime":"2026-03-20T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.973348 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.973425 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.973450 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.973489 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:02 crc kubenswrapper[4772]: I0320 10:57:02.973514 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:02Z","lastTransitionTime":"2026-03-20T10:57:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.077881 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.077989 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.078015 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.078052 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.078074 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:03Z","lastTransitionTime":"2026-03-20T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.182727 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.182818 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.182889 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.182929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.182958 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:03Z","lastTransitionTime":"2026-03-20T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.287362 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.287430 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.287450 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.287479 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.287500 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:03Z","lastTransitionTime":"2026-03-20T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.391194 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.391261 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.391281 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.391310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.391336 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:03Z","lastTransitionTime":"2026-03-20T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.495511 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.495632 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.495652 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.495674 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.495694 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:03Z","lastTransitionTime":"2026-03-20T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.599400 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.599455 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.599469 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.599490 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.599502 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:03Z","lastTransitionTime":"2026-03-20T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.703806 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.703923 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.703943 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.703973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.703995 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:03Z","lastTransitionTime":"2026-03-20T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.807800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.807907 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.807934 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.807965 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.807988 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:03Z","lastTransitionTime":"2026-03-20T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.911593 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.911675 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.911703 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.911736 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:03 crc kubenswrapper[4772]: I0320 10:57:03.911757 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:03Z","lastTransitionTime":"2026-03-20T10:57:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.014551 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.014636 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.014655 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.014692 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.014718 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:04Z","lastTransitionTime":"2026-03-20T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.118339 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.118437 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.118472 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.118514 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.118542 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:04Z","lastTransitionTime":"2026-03-20T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.222646 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.222705 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.222724 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.222795 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.222824 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:04Z","lastTransitionTime":"2026-03-20T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.326130 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.326199 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.326217 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.326242 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.326262 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:04Z","lastTransitionTime":"2026-03-20T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.430223 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.430890 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.430906 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.430929 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.430946 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:04Z","lastTransitionTime":"2026-03-20T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.535000 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.535056 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.535073 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.535095 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.535114 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:04Z","lastTransitionTime":"2026-03-20T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.638320 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.638379 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.638399 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.638421 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.638438 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:04Z","lastTransitionTime":"2026-03-20T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.640810 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.640864 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.640988 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:04 crc kubenswrapper[4772]: E0320 10:57:04.640945 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.641075 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:04 crc kubenswrapper[4772]: E0320 10:57:04.641089 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:04 crc kubenswrapper[4772]: E0320 10:57:04.641111 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:04 crc kubenswrapper[4772]: E0320 10:57:04.641362 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.669010 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\
\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is 
complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=kube-apiserver-check-endpoints pod=kube-apiserver-crc_openshift-kube-apiserver(f4b27818a5e8e43d0dc095d08835c792)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.700056 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:50Z\\\",\\\"message\\\":\\\"h-apiserver/api\\\\\\\"}\\\\nI0320 10:56:50.161482 6686 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.281336ms\\\\nI0320 10:56:50.161559 6686 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161606 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:50.161641 6686 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:50.161665 6686 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:50.161692 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:50.161695 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161703 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:50.161719 6686 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:50.161739 6686 factory.go:656] Stopping watch factory\\\\nI0320 10:56:50.161743 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:50.161764 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:50.161829 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:50.161877 6686 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:50.161957 6686 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.732419 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a
6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.742357 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.742499 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:04 crc 
kubenswrapper[4772]: I0320 10:57:04.742530 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.742573 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.742608 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:04Z","lastTransitionTime":"2026-03-20T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.749628 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.770041 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.791940 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.811453 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.833353 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.846351 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.846395 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.846416 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.846446 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.846466 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:04Z","lastTransitionTime":"2026-03-20T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.856265 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.874959 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.900037 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.918204 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 
10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.934061 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.950396 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.950463 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.950484 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.950513 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.950535 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:04Z","lastTransitionTime":"2026-03-20T10:57:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.953034 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.971568 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:04 crc kubenswrapper[4772]: I0320 10:57:04.988747 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:04Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.096575 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.096651 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.096671 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.096702 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.096725 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:05Z","lastTransitionTime":"2026-03-20T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.199973 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.200043 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.200065 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.200092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.200112 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:05Z","lastTransitionTime":"2026-03-20T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.304679 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.304759 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.304781 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.304809 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.304828 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:05Z","lastTransitionTime":"2026-03-20T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.409184 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.409268 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.409295 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.409332 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.409361 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:05Z","lastTransitionTime":"2026-03-20T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.513593 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.513680 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.513705 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.513735 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.513759 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:05Z","lastTransitionTime":"2026-03-20T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.617782 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.617864 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.617880 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.617900 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.617914 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:05Z","lastTransitionTime":"2026-03-20T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.721666 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.721726 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.721742 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.721765 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.721783 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:05Z","lastTransitionTime":"2026-03-20T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.824869 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.825100 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.825139 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.825175 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.825200 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:05Z","lastTransitionTime":"2026-03-20T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.928694 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.928757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.928777 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.928802 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:05 crc kubenswrapper[4772]: I0320 10:57:05.928820 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:05Z","lastTransitionTime":"2026-03-20T10:57:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.032638 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.032697 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.032711 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.032732 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.032744 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.136372 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.136451 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.136476 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.136508 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.136528 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.239916 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.240001 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.240024 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.240057 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.240080 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.342800 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.343367 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.343540 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.343680 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.343807 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.361089 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.361119 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.361131 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.361144 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.361155 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: E0320 10:57:06.392597 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.403199 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.403260 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.403275 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.403301 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.403318 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: E0320 10:57:06.448537 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.454021 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.454215 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.454340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.454597 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.454740 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: E0320 10:57:06.473600 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.478325 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.478365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.478375 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.478391 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.478402 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: E0320 10:57:06.494956 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.499049 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.499083 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.499095 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.499110 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.499122 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: E0320 10:57:06.513820 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:06Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:06Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:06 crc kubenswrapper[4772]: E0320 10:57:06.513951 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.516348 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.516535 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.516652 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.516770 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.516904 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.620689 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.620760 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.620774 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.620798 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.620812 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.642469 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.642567 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.642658 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.642476 4772 scope.go:117] "RemoveContainer" containerID="a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5" Mar 20 10:57:06 crc kubenswrapper[4772]: E0320 10:57:06.643217 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:06 crc kubenswrapper[4772]: E0320 10:57:06.642663 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:06 crc kubenswrapper[4772]: E0320 10:57:06.643357 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.643699 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:06 crc kubenswrapper[4772]: E0320 10:57:06.644098 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.727003 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.727076 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.727092 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.727117 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.727134 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.831027 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.831105 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.831126 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.831155 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.831173 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.933900 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.934138 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.934206 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.934294 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:06 crc kubenswrapper[4772]: I0320 10:57:06.934526 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:06Z","lastTransitionTime":"2026-03-20T10:57:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.038032 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.038097 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.038116 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.038145 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.038162 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.142820 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.142940 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.143000 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.143028 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.143083 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.247198 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.247267 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.247285 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.247309 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.247329 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.280321 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.283289 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5"} Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.283685 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.317221 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod
-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.345192 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.358809 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.358894 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.358920 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.358955 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.358980 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.364429 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.383091 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.399157 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc 
kubenswrapper[4772]: I0320 10:57:07.416652 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.434826 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.448656 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.461352 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.461390 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.461400 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.461417 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.461427 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.462398 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.483005 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.497242 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.517141 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.530219 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.542106 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.558236 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256
:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.563869 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.563908 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.563923 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.563943 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.563956 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.588043 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:50Z\\\",\\\"message\\\":\\\"h-apiserver/api\\\\\\\"}\\\\nI0320 10:56:50.161482 6686 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.281336ms\\\\nI0320 10:56:50.161559 6686 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161606 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:50.161641 6686 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:50.161665 6686 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:50.161692 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:50.161695 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161703 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:50.161719 6686 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:50.161739 6686 factory.go:656] Stopping watch factory\\\\nI0320 10:56:50.161743 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:50.161764 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:50.161829 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:50.161877 6686 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 
10:56:50.161957 6686 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:07Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.642447 4772 scope.go:117] "RemoveContainer" containerID="b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.673568 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.673645 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.673666 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.673693 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.673711 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.776846 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.777766 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.777858 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.777939 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.778024 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.880876 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.880910 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.880919 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.880963 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.880978 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.984897 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.984978 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.984999 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.985035 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:07 crc kubenswrapper[4772]: I0320 10:57:07.985058 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:07Z","lastTransitionTime":"2026-03-20T10:57:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.019040 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:08 crc kubenswrapper[4772]: E0320 10:57:08.019390 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:08 crc kubenswrapper[4772]: E0320 10:57:08.019562 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs podName:2ac5550b-02eb-48b4-b62a-e21dd4429249 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:24.019524453 +0000 UTC m=+130.110490978 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs") pod "network-metrics-daemon-m8kjd" (UID: "2ac5550b-02eb-48b4-b62a-e21dd4429249") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.087794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.087855 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.087869 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.087886 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.087898 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:08Z","lastTransitionTime":"2026-03-20T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.190340 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.190412 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.190426 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.190450 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.190468 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:08Z","lastTransitionTime":"2026-03-20T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.287412 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/1.log" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.289479 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38"} Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.292792 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.292827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.292854 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.292868 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.292879 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:08Z","lastTransitionTime":"2026-03-20T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.305661 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.327801 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.344022 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.358506 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.376520 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.394754 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.394785 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.394795 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.394809 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.394819 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:08Z","lastTransitionTime":"2026-03-20T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.401690 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"
mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.413998 4772 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.427250 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.441529 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.451465 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.462048 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.478374 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.493346 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.501299 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.501350 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.501366 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.501384 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.501402 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:08Z","lastTransitionTime":"2026-03-20T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.511334 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.525180 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.541549 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:50Z\\\",\\\"message\\\":\\\"h-apiserver/api\\\\\\\"}\\\\nI0320 10:56:50.161482 6686 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.281336ms\\\\nI0320 10:56:50.161559 6686 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161606 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:50.161641 6686 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:50.161665 6686 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:50.161692 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:50.161695 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161703 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:50.161719 6686 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:50.161739 6686 factory.go:656] Stopping watch factory\\\\nI0320 10:56:50.161743 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:50.161764 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:50.161829 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:50.161877 6686 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:56:50.161957 6686 
ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":
[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:08Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.603876 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.603922 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.603940 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.603962 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.603978 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:08Z","lastTransitionTime":"2026-03-20T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.640940 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:08 crc kubenswrapper[4772]: E0320 10:57:08.641099 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.641112 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.641190 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.641246 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:08 crc kubenswrapper[4772]: E0320 10:57:08.641212 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:08 crc kubenswrapper[4772]: E0320 10:57:08.641375 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:08 crc kubenswrapper[4772]: E0320 10:57:08.641609 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.706674 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.706709 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.706718 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.706730 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.706740 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:08Z","lastTransitionTime":"2026-03-20T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.809612 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.809665 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.809681 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.809701 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.809722 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:08Z","lastTransitionTime":"2026-03-20T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.912129 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.912200 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.912220 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.912246 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:08 crc kubenswrapper[4772]: I0320 10:57:08.912264 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:08Z","lastTransitionTime":"2026-03-20T10:57:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.014877 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.014918 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.014931 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.014947 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.014958 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:09Z","lastTransitionTime":"2026-03-20T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.117331 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.117400 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.117418 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.117445 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.117462 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:09Z","lastTransitionTime":"2026-03-20T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.220708 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.220772 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.220790 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.220815 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.220832 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:09Z","lastTransitionTime":"2026-03-20T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.295996 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/2.log" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.297324 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/1.log" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.301640 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38" exitCode=1 Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.301721 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.301793 4772 scope.go:117] "RemoveContainer" containerID="b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.302915 4772 scope.go:117] "RemoveContainer" containerID="6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38" Mar 20 10:57:09 crc kubenswrapper[4772]: E0320 10:57:09.303210 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.325147 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.325711 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.325761 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.325780 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.325807 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.325826 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:09Z","lastTransitionTime":"2026-03-20T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.352494 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773
152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/e
ntrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.368705 4772 
status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: 
failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.388476 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.409044 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.428243 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.430210 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.430266 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.430283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.430306 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.430324 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:09Z","lastTransitionTime":"2026-03-20T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.449956 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.470196 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.487697 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.510211 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.532780 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.532914 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.532941 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.532969 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.532990 4772 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:09Z","lastTransitionTime":"2026-03-20T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.547047 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\
":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access
-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b4eb7669197cd9347749ef7b2a73560bbd6a769b040d72fbf30c35be8c7a9dd2\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:56:50Z\\\",\\\"message\\\":\\\"h-apiserver/api\\\\\\\"}\\\\nI0320 10:56:50.161482 6686 services_controller.go:360] Finished syncing service api on namespace openshift-oauth-apiserver for network=default : 1.281336ms\\\\nI0320 10:56:50.161559 6686 factory.go:1336] Added *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161606 6686 handler.go:190] Sending *v1.EgressFirewall event handler 9 for removal\\\\nI0320 10:56:50.161641 6686 controller.go:132] Adding controller ef_node_controller event handlers\\\\nI0320 10:56:50.161665 6686 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:56:50.161692 6686 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:56:50.161695 6686 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0320 10:56:50.161703 6686 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:56:50.161719 6686 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:56:50.161739 6686 factory.go:656] Stopping watch factory\\\\nI0320 10:56:50.161743 6686 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0320 10:56:50.161764 6686 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:56:50.161829 6686 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:56:50.161877 6686 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 
10:56:50.161957 6686 ovnkube.go:\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:49Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:08Z\\\",\\\"message\\\":\\\" 6966 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:57:08.481498 6966 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.481774 6966 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:57:08.482008 6966 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.482687 6966 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:08.482809 6966 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:57:08.482891 6966 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:08.482934 6966 factory.go:656] Stopping watch factory\\\\nI0320 10:57:08.482967 6966 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:08.483008 6966 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:08.483033 6966 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:57:08.483175 6966 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:07Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20
99482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.568966 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/c
ni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.586181 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.612650 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.632339 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.635051 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.635096 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.635113 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.635138 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.635156 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:09Z","lastTransitionTime":"2026-03-20T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.658598 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:09Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.737975 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 
10:57:09.738009 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.738017 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.738031 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.738042 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:09Z","lastTransitionTime":"2026-03-20T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.841119 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.841170 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.841182 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.841203 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.841216 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:09Z","lastTransitionTime":"2026-03-20T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.944415 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.944477 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.944496 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.944519 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.944536 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:09Z","lastTransitionTime":"2026-03-20T10:57:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:09 crc kubenswrapper[4772]: I0320 10:57:09.963989 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.049296 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.049327 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.049337 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.049354 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.049365 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:10Z","lastTransitionTime":"2026-03-20T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.151763 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.151835 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.151866 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.151879 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.151888 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:10Z","lastTransitionTime":"2026-03-20T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.254288 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.254346 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.254364 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.254387 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.254404 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:10Z","lastTransitionTime":"2026-03-20T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.307969 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/2.log" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.312375 4772 scope.go:117] "RemoveContainer" containerID="6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38" Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.312655 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.324895 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.341992 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.354000 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.356388 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.356416 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.356424 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.356437 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.356446 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:10Z","lastTransitionTime":"2026-03-20T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.366789 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.383681 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.411202 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:08Z\\\",\\\"message\\\":\\\" 6966 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:57:08.481498 6966 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.481774 6966 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:57:08.482008 6966 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.482687 6966 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:08.482809 6966 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:57:08.482891 6966 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:08.482934 6966 factory.go:656] Stopping watch factory\\\\nI0320 10:57:08.482967 6966 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:08.483008 6966 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:08.483033 6966 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:57:08.483175 6966 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.426824 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-
cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.435674 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.457715 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.459082 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.459121 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.459130 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.459145 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.459156 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:10Z","lastTransitionTime":"2026-03-20T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.474964 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.486400 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.496559 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.516463 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.529166 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 
10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.543496 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.558298 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:10Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.560739 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.560768 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.560778 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.560794 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.560807 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:10Z","lastTransitionTime":"2026-03-20T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.641854 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.641925 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.641977 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.641982 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.642078 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.642216 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.642400 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.642431 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.655204 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.655409 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:57:42.655380841 +0000 UTC m=+148.746347356 (durationBeforeRetry 32s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.655469 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.655499 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.655523 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.655605 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.655659 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:42.655645038 +0000 UTC m=+148.746611543 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.655711 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.655713 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.655806 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. 
No retries permitted until 2026-03-20 10:57:42.655782662 +0000 UTC m=+148.746749177 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.655824 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.655868 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.655942 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:42.655929876 +0000 UTC m=+148.746896371 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.663058 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.663108 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.663117 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.663131 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.663139 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:10Z","lastTransitionTime":"2026-03-20T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.756352 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.756596 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.756623 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.756642 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:10 crc kubenswrapper[4772]: E0320 10:57:10.756720 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:42.756697214 +0000 UTC m=+148.847663729 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.765830 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.765933 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.765963 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.765994 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.766016 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:10Z","lastTransitionTime":"2026-03-20T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.868465 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.868528 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.868552 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.868580 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.868600 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:10Z","lastTransitionTime":"2026-03-20T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.971646 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.971697 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.971710 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.971727 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:10 crc kubenswrapper[4772]: I0320 10:57:10.971737 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:10Z","lastTransitionTime":"2026-03-20T10:57:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.074812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.074915 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.074940 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.074971 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.074993 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:11Z","lastTransitionTime":"2026-03-20T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.177418 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.177461 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.177473 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.177490 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.177504 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:11Z","lastTransitionTime":"2026-03-20T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.280793 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.280881 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.280901 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.280925 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.281038 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:11Z","lastTransitionTime":"2026-03-20T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.383721 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.383793 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.383812 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.383860 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.383880 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:11Z","lastTransitionTime":"2026-03-20T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.486412 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.486471 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.486487 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.486512 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.486529 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:11Z","lastTransitionTime":"2026-03-20T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.589171 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.589710 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.589750 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.589777 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.589812 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:11Z","lastTransitionTime":"2026-03-20T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.697292 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.697331 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.697341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.697355 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.697365 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:11Z","lastTransitionTime":"2026-03-20T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.801144 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.801212 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.801232 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.801258 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.801275 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:11Z","lastTransitionTime":"2026-03-20T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.904985 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.905057 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.905079 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.905108 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:11 crc kubenswrapper[4772]: I0320 10:57:11.905132 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:11Z","lastTransitionTime":"2026-03-20T10:57:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.007944 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.008006 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.008028 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.008054 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.008074 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:12Z","lastTransitionTime":"2026-03-20T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.111220 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.111265 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.111278 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.111296 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.111307 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:12Z","lastTransitionTime":"2026-03-20T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.214561 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.214982 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.215183 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.215329 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.215458 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:12Z","lastTransitionTime":"2026-03-20T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.317760 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.317814 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.317830 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.317884 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.317902 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:12Z","lastTransitionTime":"2026-03-20T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.421283 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.421341 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.421365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.421392 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.421415 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:12Z","lastTransitionTime":"2026-03-20T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.524237 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.524307 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.524330 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.524356 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.524377 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:12Z","lastTransitionTime":"2026-03-20T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.627264 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.627359 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.627378 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.627474 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.627496 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:12Z","lastTransitionTime":"2026-03-20T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.641943 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.641984 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.642106 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:12 crc kubenswrapper[4772]: E0320 10:57:12.642112 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.642141 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:12 crc kubenswrapper[4772]: E0320 10:57:12.643013 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:12 crc kubenswrapper[4772]: E0320 10:57:12.643092 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:12 crc kubenswrapper[4772]: E0320 10:57:12.643205 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.730664 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.730739 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.730759 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.730786 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.730800 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:12Z","lastTransitionTime":"2026-03-20T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.834141 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.834196 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.834212 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.834233 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.834252 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:12Z","lastTransitionTime":"2026-03-20T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.936492 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.936540 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.936552 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.936570 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:12 crc kubenswrapper[4772]: I0320 10:57:12.936583 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:12Z","lastTransitionTime":"2026-03-20T10:57:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.040058 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.040090 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.040098 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.040111 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.040122 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:13Z","lastTransitionTime":"2026-03-20T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.142923 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.142978 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.142992 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.143008 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.143020 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:13Z","lastTransitionTime":"2026-03-20T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.246272 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.246315 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.246328 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.246349 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.246365 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:13Z","lastTransitionTime":"2026-03-20T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.348578 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.348647 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.348661 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.348714 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.348727 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:13Z","lastTransitionTime":"2026-03-20T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.451171 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.451199 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.451210 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.451222 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.451231 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:13Z","lastTransitionTime":"2026-03-20T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.553780 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.553825 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.553871 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.553893 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.553906 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:13Z","lastTransitionTime":"2026-03-20T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.656388 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.656424 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.656455 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.656470 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.656481 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:13Z","lastTransitionTime":"2026-03-20T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.760088 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.760126 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.760135 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.760152 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.760162 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:13Z","lastTransitionTime":"2026-03-20T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.862628 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.862688 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.862705 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.862727 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.862743 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:13Z","lastTransitionTime":"2026-03-20T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.965684 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.966772 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.966942 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.967086 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:13 crc kubenswrapper[4772]: I0320 10:57:13.967342 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:13Z","lastTransitionTime":"2026-03-20T10:57:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.070210 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.070269 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.070287 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.070310 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.070328 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:14Z","lastTransitionTime":"2026-03-20T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.173593 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.173659 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.173700 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.173730 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.173787 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:14Z","lastTransitionTime":"2026-03-20T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.276233 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.276298 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.276314 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.276337 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.276355 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:14Z","lastTransitionTime":"2026-03-20T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.379546 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.379594 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.379612 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.379633 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.379650 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:14Z","lastTransitionTime":"2026-03-20T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.482950 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.483007 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.483026 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.483050 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.483067 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:14Z","lastTransitionTime":"2026-03-20T10:57:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:14 crc kubenswrapper[4772]: E0320 10:57:14.583823 4772 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.641186 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.641222 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.641356 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:14 crc kubenswrapper[4772]: E0320 10:57:14.641585 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.641632 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:14 crc kubenswrapper[4772]: E0320 10:57:14.642102 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:14 crc kubenswrapper[4772]: E0320 10:57:14.642192 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:14 crc kubenswrapper[4772]: E0320 10:57:14.642334 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.662122 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.681665 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.697481 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.711532 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.732130 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-apiserver-check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.760987 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:08Z\\\",\\\"message\\\":\\\" 6966 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:57:08.481498 6966 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.481774 6966 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:57:08.482008 6966 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.482687 6966 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:08.482809 6966 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:57:08.482891 6966 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:08.482934 6966 factory.go:656] Stopping watch factory\\\\nI0320 10:57:08.482967 6966 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:08.483008 6966 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:08.483033 6966 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:57:08.483175 6966 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: E0320 10:57:14.772935 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.778565 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.811098 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.832359 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.848033 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.866618 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.888730 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b16
2f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T
10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.905007 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.924480 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.946994 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:14 crc kubenswrapper[4772]: I0320 10:57:14.962593 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:14Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.535696 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.535757 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.535773 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.535796 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.535814 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:16Z","lastTransitionTime":"2026-03-20T10:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: 
no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:16 crc kubenswrapper[4772]: E0320 10:57:16.557893 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.563082 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.563106 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.563114 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.563126 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.563135 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:16Z","lastTransitionTime":"2026-03-20T10:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:16 crc kubenswrapper[4772]: E0320 10:57:16.579175 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.583550 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.583621 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.583647 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.583677 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.583698 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:16Z","lastTransitionTime":"2026-03-20T10:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:16 crc kubenswrapper[4772]: E0320 10:57:16.598913 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.604431 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.604495 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.604513 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.604539 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.604556 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:16Z","lastTransitionTime":"2026-03-20T10:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:16 crc kubenswrapper[4772]: E0320 10:57:16.625151 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.629578 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.629668 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.629720 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.629744 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.629762 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:16Z","lastTransitionTime":"2026-03-20T10:57:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.641535 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.641589 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.641547 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:16 crc kubenswrapper[4772]: E0320 10:57:16.641694 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:16 crc kubenswrapper[4772]: I0320 10:57:16.641781 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:16 crc kubenswrapper[4772]: E0320 10:57:16.641811 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:16 crc kubenswrapper[4772]: E0320 10:57:16.641977 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:16 crc kubenswrapper[4772]: E0320 10:57:16.642126 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:16 crc kubenswrapper[4772]: E0320 10:57:16.651939 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:16Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:16Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:16 crc kubenswrapper[4772]: E0320 10:57:16.652156 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:57:18 crc kubenswrapper[4772]: I0320 10:57:18.641797 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:18 crc kubenswrapper[4772]: I0320 10:57:18.641966 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:18 crc kubenswrapper[4772]: I0320 10:57:18.642057 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:18 crc kubenswrapper[4772]: E0320 10:57:18.642304 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:18 crc kubenswrapper[4772]: E0320 10:57:18.642438 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:18 crc kubenswrapper[4772]: E0320 10:57:18.642503 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:18 crc kubenswrapper[4772]: I0320 10:57:18.642136 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:18 crc kubenswrapper[4772]: E0320 10:57:18.643586 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:19 crc kubenswrapper[4772]: E0320 10:57:19.773968 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:57:19 crc kubenswrapper[4772]: I0320 10:57:19.777722 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 10:57:19 crc kubenswrapper[4772]: I0320 10:57:19.799366 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:19 crc kubenswrapper[4772]: I0320 10:57:19.819700 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:19 crc kubenswrapper[4772]: I0320 10:57:19.837914 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:19 crc kubenswrapper[4772]: I0320 10:57:19.864099 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:19 crc kubenswrapper[4772]: I0320 10:57:19.880988 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:19Z is after 2025-08-24T17:21:41Z" Mar 20 
10:57:19 crc kubenswrapper[4772]: I0320 10:57:19.901670 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:19 crc kubenswrapper[4772]: I0320 10:57:19.919688 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:19 crc kubenswrapper[4772]: I0320 10:57:19.935975 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:19 crc kubenswrapper[4772]: I0320 10:57:19.954260 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:19 crc kubenswrapper[4772]: I0320 10:57:19.984996 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.
0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c09
5eb68ef3c763d49bc642cd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:08Z\\\",\\\"message\\\":\\\" 6966 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:57:08.481498 6966 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.481774 6966 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:57:08.482008 6966 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.482687 6966 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:08.482809 6966 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:57:08.482891 6966 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:08.482934 6966 factory.go:656] Stopping watch factory\\\\nI0320 10:57:08.482967 6966 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:08.483008 6966 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:08.483033 6966 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:57:08.483175 6966 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:19Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:20 crc kubenswrapper[4772]: I0320 10:57:20.009663 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"i
mage\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 
'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:20 crc kubenswrapper[4772]: I0320 10:57:20.043700 4772 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://c
b9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:20 crc kubenswrapper[4772]: I0320 10:57:20.063810 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:20 crc kubenswrapper[4772]: I0320 10:57:20.083147 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:20 crc kubenswrapper[4772]: I0320 10:57:20.098010 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni
/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:20 crc kubenswrapper[4772]: I0320 10:57:20.111099 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:20Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:20 crc kubenswrapper[4772]: I0320 10:57:20.641269 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:20 crc kubenswrapper[4772]: I0320 10:57:20.641309 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:20 crc kubenswrapper[4772]: I0320 10:57:20.641388 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:20 crc kubenswrapper[4772]: I0320 10:57:20.641460 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:20 crc kubenswrapper[4772]: E0320 10:57:20.641454 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:20 crc kubenswrapper[4772]: E0320 10:57:20.641545 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:20 crc kubenswrapper[4772]: E0320 10:57:20.641753 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:20 crc kubenswrapper[4772]: E0320 10:57:20.642151 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:22 crc kubenswrapper[4772]: I0320 10:57:22.641098 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:22 crc kubenswrapper[4772]: E0320 10:57:22.641287 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:22 crc kubenswrapper[4772]: I0320 10:57:22.641607 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:22 crc kubenswrapper[4772]: E0320 10:57:22.641691 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:22 crc kubenswrapper[4772]: I0320 10:57:22.641943 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:22 crc kubenswrapper[4772]: E0320 10:57:22.642057 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:22 crc kubenswrapper[4772]: I0320 10:57:22.642130 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:22 crc kubenswrapper[4772]: E0320 10:57:22.642308 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.092741 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:24 crc kubenswrapper[4772]: E0320 10:57:24.092976 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:24 crc kubenswrapper[4772]: E0320 10:57:24.093075 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs podName:2ac5550b-02eb-48b4-b62a-e21dd4429249 nodeName:}" failed. No retries permitted until 2026-03-20 10:57:56.093047651 +0000 UTC m=+162.184014146 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs") pod "network-metrics-daemon-m8kjd" (UID: "2ac5550b-02eb-48b4-b62a-e21dd4429249") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.641746 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.641899 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:24 crc kubenswrapper[4772]: E0320 10:57:24.641992 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.642005 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.642060 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:24 crc kubenswrapper[4772]: E0320 10:57:24.642107 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:24 crc kubenswrapper[4772]: E0320 10:57:24.642210 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:24 crc kubenswrapper[4772]: E0320 10:57:24.642318 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.664042 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"co
ntainerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f4
84b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.682305 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.699783 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.721288 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.731077 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc 
kubenswrapper[4772]: I0320 10:57:24.745136 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.762649 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: E0320 10:57:24.774767 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.777324 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.798056 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.812514 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 
10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.826225 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.837381 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.848902 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.859292 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.871885 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:24 crc kubenswrapper[4772]: I0320 10:57:24.892378 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:08Z\\\",\\\"message\\\":\\\" 6966 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:57:08.481498 6966 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.481774 6966 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:57:08.482008 6966 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.482687 6966 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:08.482809 6966 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:57:08.482891 6966 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:08.482934 6966 factory.go:656] Stopping watch factory\\\\nI0320 10:57:08.482967 6966 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:08.483008 6966 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:08.483033 6966 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:57:08.483175 6966 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:24Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:25 crc kubenswrapper[4772]: I0320 10:57:25.643221 4772 scope.go:117] "RemoveContainer" containerID="6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38" Mar 20 10:57:25 crc kubenswrapper[4772]: E0320 10:57:25.643492 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" Mar 20 10:57:25 crc kubenswrapper[4772]: I0320 10:57:25.657100 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.378621 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7fpq9_a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d/kube-multus/0.log" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.378667 4772 generic.go:334] "Generic (PLEG): container finished" podID="a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d" containerID="de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66" exitCode=1 Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.378722 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7fpq9" event={"ID":"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d","Type":"ContainerDied","Data":"de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66"} Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.379232 4772 scope.go:117] "RemoveContainer" containerID="de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.397460 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.433139 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:08Z\\\",\\\"message\\\":\\\" 6966 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:57:08.481498 6966 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.481774 6966 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:57:08.482008 6966 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.482687 6966 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:08.482809 6966 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:57:08.482891 6966 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:08.482934 6966 factory.go:656] Stopping watch factory\\\\nI0320 10:57:08.482967 6966 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:08.483008 6966 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:08.483033 6966 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:57:08.483175 6966 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.458986 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:26Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:26Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:26Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc\\\\n2026-03-20T10:56:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc to /host/opt/cni/bin/\\\\n2026-03-20T10:56:41Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.476646 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: 
current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.509082 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeM
ounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\
"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.534402 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.554074 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.570154 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.588414 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.600437 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 
10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.620182 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a1997e-58f5-4755-9a44-7d63be4de00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad76f823c12a841777c33e9050b031bfdc49600d43524c67d2b54b39a1ae8825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:55:16.963663 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 10:55:16.966203 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:55:17.025013 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:55:17.031237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:55:41.926937 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:55:41.927086 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:41Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa8330ebb1d9f35b3265890086718ee7a34b8129ab2a14536e5eaf679ccf417a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55dc0780a7d7d7ce5e51f53e62250dc1b32daf50598a8a7858d4bd58857affd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.638541 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.641401 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.641427 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:26 crc kubenswrapper[4772]: E0320 10:57:26.641515 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.641571 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.641571 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:26 crc kubenswrapper[4772]: E0320 10:57:26.641688 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:26 crc kubenswrapper[4772]: E0320 10:57:26.641748 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:26 crc kubenswrapper[4772]: E0320 10:57:26.641794 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.653547 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.670902 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.687384 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.702640 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:26 crc kubenswrapper[4772]: I0320 10:57:26.714856 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:26Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.053411 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.053479 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.053499 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.053524 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.053541 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:27Z","lastTransitionTime":"2026-03-20T10:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:27 crc kubenswrapper[4772]: E0320 10:57:27.075407 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.080331 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.080384 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.080401 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.080427 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.080445 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:27Z","lastTransitionTime":"2026-03-20T10:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:27 crc kubenswrapper[4772]: E0320 10:57:27.096663 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.101878 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.101932 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.101949 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.101972 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.101989 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:27Z","lastTransitionTime":"2026-03-20T10:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:27 crc kubenswrapper[4772]: E0320 10:57:27.122037 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.127459 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.127529 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.127555 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.127584 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.127606 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:27Z","lastTransitionTime":"2026-03-20T10:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:27 crc kubenswrapper[4772]: E0320 10:57:27.149574 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.159404 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.159465 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.159488 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.159523 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.159547 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:27Z","lastTransitionTime":"2026-03-20T10:57:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:27 crc kubenswrapper[4772]: E0320 10:57:27.181442 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: E0320 10:57:27.181706 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.384801 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-7fpq9_a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d/kube-multus/0.log" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.384934 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7fpq9" event={"ID":"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d","Type":"ContainerStarted","Data":"b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003"} Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.422146 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"i
mageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.444363 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.466603 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{
\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.487489 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:26Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc\\\\n2026-03-20T10:56:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc to /host/opt/cni/bin/\\\\n2026-03-20T10:56:41Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.504518 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.522734 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 
10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.544892 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a1997e-58f5-4755-9a44-7d63be4de00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad76f823c12a841777c33e9050b031bfdc49600d43524c67d2b54b39a1ae8825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:55:16.963663 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 10:55:16.966203 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:55:17.025013 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:55:17.031237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:55:41.926937 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:55:41.927086 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:41Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa8330ebb1d9f35b3265890086718ee7a34b8129ab2a14536e5eaf679ccf417a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55dc0780a7d7d7ce5e51f53e62250dc1b32daf50598a8a7858d4bd58857affd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.564996 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.585809 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.602532 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.624211 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.640156 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.659343 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.675280 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae3
4a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.687760 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod 
\"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.709017 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\
"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:27 crc kubenswrapper[4772]: I0320 10:57:27.737900 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:08Z\\\",\\\"message\\\":\\\" 6966 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:57:08.481498 6966 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.481774 6966 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:57:08.482008 6966 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.482687 6966 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:08.482809 6966 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:57:08.482891 6966 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:08.482934 6966 factory.go:656] Stopping watch factory\\\\nI0320 10:57:08.482967 6966 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:08.483008 6966 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:08.483033 6966 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:57:08.483175 6966 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:27Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:28 crc kubenswrapper[4772]: I0320 10:57:28.641005 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:28 crc kubenswrapper[4772]: I0320 10:57:28.641065 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:28 crc kubenswrapper[4772]: I0320 10:57:28.641160 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:28 crc kubenswrapper[4772]: E0320 10:57:28.641348 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:28 crc kubenswrapper[4772]: I0320 10:57:28.641422 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:28 crc kubenswrapper[4772]: E0320 10:57:28.641611 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:28 crc kubenswrapper[4772]: E0320 10:57:28.641685 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:28 crc kubenswrapper[4772]: E0320 10:57:28.641821 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:29 crc kubenswrapper[4772]: E0320 10:57:29.777029 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:30 crc kubenswrapper[4772]: I0320 10:57:30.641161 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:30 crc kubenswrapper[4772]: I0320 10:57:30.641210 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:30 crc kubenswrapper[4772]: I0320 10:57:30.641243 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:30 crc kubenswrapper[4772]: I0320 10:57:30.641161 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:30 crc kubenswrapper[4772]: E0320 10:57:30.641303 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:30 crc kubenswrapper[4772]: E0320 10:57:30.641390 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:30 crc kubenswrapper[4772]: E0320 10:57:30.641483 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:30 crc kubenswrapper[4772]: E0320 10:57:30.641552 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:32 crc kubenswrapper[4772]: I0320 10:57:32.641507 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:32 crc kubenswrapper[4772]: I0320 10:57:32.641678 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:32 crc kubenswrapper[4772]: I0320 10:57:32.641678 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:32 crc kubenswrapper[4772]: I0320 10:57:32.641825 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:32 crc kubenswrapper[4772]: E0320 10:57:32.642082 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:32 crc kubenswrapper[4772]: E0320 10:57:32.642179 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:32 crc kubenswrapper[4772]: E0320 10:57:32.642247 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:32 crc kubenswrapper[4772]: E0320 10:57:32.642279 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.641988 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.642108 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:34 crc kubenswrapper[4772]: E0320 10:57:34.642330 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.642350 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.642389 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:34 crc kubenswrapper[4772]: E0320 10:57:34.642491 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:34 crc kubenswrapper[4772]: E0320 10:57:34.642610 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:34 crc kubenswrapper[4772]: E0320 10:57:34.642779 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.664410 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"
name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.693054 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:08Z\\\",\\\"message\\\":\\\" 6966 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:57:08.481498 6966 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.481774 6966 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:57:08.482008 6966 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.482687 6966 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:08.482809 6966 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:57:08.482891 6966 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:08.482934 6966 factory.go:656] Stopping watch factory\\\\nI0320 10:57:08.482967 6966 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:08.483008 6966 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:08.483033 6966 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:57:08.483175 6966 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.712977 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a
6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.729114 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.746442 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.757803 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:26Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc\\\\n2026-03-20T10:56:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc to /host/opt/cni/bin/\\\\n2026-03-20T10:56:41Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.768891 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: E0320 10:57:34.777835 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.785159 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a1997e-58f5-4755-9a44-7d63be4de00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad76f823c12a841777c33e9050b031bfdc49600d43524c67d2b54b39a1ae8825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:55:16.963663 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. 
Worst graceful lease acquisition is {26s}.\\\\nI0320 10:55:16.966203 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:55:17.025013 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:55:17.031237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:55:41.926937 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:55:41.927086 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:41Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa8330ebb1d9f35b3265890086718ee7a34b8129ab2a14536e5eaf679ccf417a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55dc0780a7d7d7ce5e51f53e62250dc1b32daf50598a8a7858d4bd58857affd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-clust
er-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.799114 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.820171 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.832190 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.848225 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.864344 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 
10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.878464 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.891750 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.907452 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:34 crc kubenswrapper[4772]: I0320 10:57:34.921889 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:34Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:36 crc kubenswrapper[4772]: I0320 10:57:36.641530 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:36 crc kubenswrapper[4772]: I0320 10:57:36.641542 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:36 crc kubenswrapper[4772]: I0320 10:57:36.641722 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:36 crc kubenswrapper[4772]: I0320 10:57:36.641731 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:36 crc kubenswrapper[4772]: E0320 10:57:36.641927 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:36 crc kubenswrapper[4772]: E0320 10:57:36.642304 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:36 crc kubenswrapper[4772]: E0320 10:57:36.642463 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:36 crc kubenswrapper[4772]: E0320 10:57:36.642525 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:36 crc kubenswrapper[4772]: I0320 10:57:36.655233 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.271299 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.271365 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.271390 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.271424 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.271450 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:37Z","lastTransitionTime":"2026-03-20T10:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Mar 20 10:57:37 crc kubenswrapper[4772]: E0320 10:57:37.295393 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.300820 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.301075 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.301218 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.301356 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.301518 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:37Z","lastTransitionTime":"2026-03-20T10:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:37 crc kubenswrapper[4772]: E0320 10:57:37.326082 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.330717 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.330756 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.330767 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.330783 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.330795 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:37Z","lastTransitionTime":"2026-03-20T10:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:37 crc kubenswrapper[4772]: E0320 10:57:37.348379 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.352700 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.352735 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.352746 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.352763 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.352777 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:37Z","lastTransitionTime":"2026-03-20T10:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:37 crc kubenswrapper[4772]: E0320 10:57:37.374599 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.379065 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.379212 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.379318 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.379827 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.379911 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:37Z","lastTransitionTime":"2026-03-20T10:57:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:37 crc kubenswrapper[4772]: E0320 10:57:37.406518 4772 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404560Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865360Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:37Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names
\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c
91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"339e6ef0-bc07-454a-8bb1-f97440045fc1\\\",\\\"systemUUID\\\":\\\"6fe7d706-49bf-443a-bc98-4f48ecaccc59\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:37Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:37 crc kubenswrapper[4772]: E0320 10:57:37.406991 4772 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Mar 20 10:57:37 crc kubenswrapper[4772]: I0320 10:57:37.642967 4772 scope.go:117] "RemoveContainer" 
containerID="6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.431782 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/2.log" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.436821 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4"} Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.437551 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.469311 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountP
ath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f571cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.490271 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.511761 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io
/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.527558 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:26Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc\\\\n2026-03-20T10:56:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc to /host/opt/cni/bin/\\\\n2026-03-20T10:56:41Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:26Z [error] have you checked that your default network is ready? 
still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.540582 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.553686 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa090047-d48b-4b5d-84a0-14908a9bd79c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb9033b2be08ce32fd65b95fa314492bd2a117445313699e62962f27e5c48356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b875b88a814ccfbc5bf71292c5869f4bd7b7f005383d7ae40a6027987ce57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d33fd0873f0fe72f8f380e8f57d887c6908f3dfe5d207f461855c83796e1b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d4830c4ca712050e5d5d77e415c0b08f95d356694b48914c735a2c51ccc8956\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d4830c4ca712050e5d5d77e415c0b08f95d356694b48914c735a2c51ccc8956\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.568055 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a1997e-58f5-4755-9a44-7d63be4de00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad76f823c12a841777c33e9050b031bfdc49600d43524c67d2b54b39a1ae8825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:55:16.963663 1 leaderelection.go:121] The leader election gives 4 retries and 
allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 10:55:16.966203 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:55:17.025013 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:55:17.031237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:55:41.926937 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:55:41.927086 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:41Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa8330ebb1d9f35b3265890086718ee7a34b8129ab2a14536e5eaf679ccf417a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55dc0780a7d7d7ce5e51f53e62250dc1b32daf50598a8a7858d4bd58857affd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:
8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.580495 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.595624 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.607463 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.625919 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.640932 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.640933 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.640991 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.641009 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:38 crc kubenswrapper[4772]: E0320 10:57:38.641145 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:38 crc kubenswrapper[4772]: E0320 10:57:38.641309 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:38 crc kubenswrapper[4772]: E0320 10:57:38.641483 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:38 crc kubenswrapper[4772]: E0320 10:57:38.641571 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.646734 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod 
\"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.665896 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.684877 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.706091 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.726567 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.742412 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:38 crc kubenswrapper[4772]: I0320 10:57:38.768721 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:08Z\\\",\\\"message\\\":\\\" 6966 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:57:08.481498 6966 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.481774 6966 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:57:08.482008 6966 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.482687 6966 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:08.482809 6966 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:57:08.482891 6966 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:08.482934 6966 factory.go:656] Stopping watch factory\\\\nI0320 10:57:08.482967 6966 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:08.483008 6966 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:08.483033 6966 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:57:08.483175 6966 ovnkube.go:137] failed to run 
ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"con
tainerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:38Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.444491 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/3.log" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.445466 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/2.log" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.449280 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4" exitCode=1 Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.449329 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4"} Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.449376 4772 scope.go:117] "RemoveContainer" containerID="6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.452608 4772 scope.go:117] "RemoveContainer" containerID="12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4" Mar 20 10:57:39 crc kubenswrapper[4772]: E0320 10:57:39.452976 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.471119 4772 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.499881 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.518108 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.532111 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.550740 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.578960 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://6ff055293046173636b8fb876151384fd8cf6c095eb68ef3c763d49bc642cd38\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:08Z\\\",\\\"message\\\":\\\" 6966 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0320 10:57:08.481498 6966 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.481774 6966 reflector.go:311] Stopping reflector *v1.UserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0320 10:57:08.482008 6966 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0320 10:57:08.482687 6966 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:08.482809 6966 handler.go:190] Sending *v1.Node event handler 2 for removal\\\\nI0320 10:57:08.482891 6966 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:08.482934 6966 factory.go:656] Stopping watch factory\\\\nI0320 10:57:08.482967 6966 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:08.483008 6966 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:08.483033 6966 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0320 10:57:08.483175 6966 ovnkube.go:137] failed to run ov\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:07Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:38Z\\\",\\\"message\\\":\\\"0320 10:57:38.928756 7303 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:57:38.928795 7303 handler.go:190] Sending *v1.EgressIP event handler 8 
for removal\\\\nI0320 10:57:38.928823 7303 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:57:38.928829 7303 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:57:38.928864 7303 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:57:38.928864 7303 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:38.928882 7303 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:38.928888 7303 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:57:38.928899 7303 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:57:38.928904 7303 factory.go:656] Stopping watch factory\\\\nI0320 10:57:38.928921 7303 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:38.928917 7303 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:57:38.928921 7303 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:57:38.928940 7303 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:38.928895 7303 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:57:38.928943 7303 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:38Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.610875 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.630757 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.652933 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.675523 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:26Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc\\\\n2026-03-20T10:56:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc to /host/opt/cni/bin/\\\\n2026-03-20T10:56:41Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.695032 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.713481 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa090047-d48b-4b5d-84a0-14908a9bd79c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb9033b2be08ce32fd65b95fa314492bd2a117445313699e62962f27e5c48356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b875b88a814ccfbc5bf71292c5869f4bd7b7f005383d7ae40a6027987ce57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d33fd0873f0fe72f8f380e8f57d887c6908f3dfe5d207f461855c83796e1b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d4830c4ca712050e5d5d77e415c0b08f95d356694b48914c735a2c51ccc8956\\\",\\\"imag
e\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d4830c4ca712050e5d5d77e415c0b08f95d356694b48914c735a2c51ccc8956\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.729745 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a1997e-58f5-4755-9a44-7d63be4de00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad76f823c12a841777c33e9050b031bfdc49600d43524c67d2b54b39a1ae8825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml --kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:55:16.963663 1 leaderelection.go:121] The leader election gives 4 retries and 
allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 10:55:16.966203 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:55:17.025013 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:55:17.031237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:55:41.926937 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:55:41.927086 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:41Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa8330ebb1d9f35b3265890086718ee7a34b8129ab2a14536e5eaf679ccf417a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55dc0780a7d7d7ce5e51f53e62250dc1b32daf50598a8a7858d4bd58857affd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:
8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.751336 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.773568 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: E0320 10:57:39.779538 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.787742 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.817345 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:39 crc kubenswrapper[4772]: I0320 10:57:39.837002 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:39Z is after 2025-08-24T17:21:41Z" Mar 20 
10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.456575 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/3.log" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.462616 4772 scope.go:117] "RemoveContainer" containerID="12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4" Mar 20 10:57:40 crc kubenswrapper[4772]: E0320 10:57:40.462857 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.482145 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"st
artedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.498399 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": 
tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.513416 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.534176 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.557691 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"container
ID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 
10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 
10:57:40.591794 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\
\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d77325745
3265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:38Z\\\",\\\"message\\\":\\\"0320 10:57:38.928756 7303 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:57:38.928795 7303 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:57:38.928823 7303 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:57:38.928829 7303 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:57:38.928864 7303 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:57:38.928864 7303 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:38.928882 7303 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:38.928888 7303 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:57:38.928899 7303 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:57:38.928904 7303 factory.go:656] Stopping watch factory\\\\nI0320 10:57:38.928921 7303 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:38.928917 7303 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:57:38.928921 7303 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:57:38.928940 7303 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:38.928895 7303 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:57:38.928943 7303 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.611819 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.632729 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.641817 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.641904 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:40 crc kubenswrapper[4772]: E0320 10:57:40.642109 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.642292 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.642422 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:40 crc kubenswrapper[4772]: E0320 10:57:40.642327 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:40 crc kubenswrapper[4772]: E0320 10:57:40.642597 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:40 crc kubenswrapper[4772]: E0320 10:57:40.642932 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.655238 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:26Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc\\\\n2026-03-20T10:56:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc to /host/opt/cni/bin/\\\\n2026-03-20T10:56:41Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.673207 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with 
unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.715522 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.736800 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.755984 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.773214 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.800426 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.819335 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 
10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.837665 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa090047-d48b-4b5d-84a0-14908a9bd79c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb9033b2be08ce32fd65b95fa314492bd2a117445313699e62962f27e5c48356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b875b88a814ccfbc5bf71292c5869f4bd7b7f005383d7ae40a6027987ce57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d33fd0873f0fe72f8f380e8f57d887c6908f3dfe5d207f461855c83796e1b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d4830c4ca712050e5d5d77e415c0b08f95d356694b48914c735a2c51ccc8956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d4830c4ca712050e5d5d77e415c0b08f95d356694b48914c735a2c51ccc8956\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:40 crc kubenswrapper[4772]: I0320 10:57:40.867559 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a1997e-58f5-4755-9a44-7d63be4de00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad76f823c12a841777c33e9050b031bfdc49600d43524c67d2b54b39a1ae8825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:55:16.963663 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 10:55:16.966203 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:55:17.025013 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:55:17.031237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:55:41.926937 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:55:41.927086 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:41Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa8330ebb1d9f35b3265890086718ee7a34b8129ab2a14536e5eaf679ccf417a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55dc0780a7d7d7ce5e51f53e62250dc1b32daf50598a8a7858d4bd58857affd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:40Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:42 crc kubenswrapper[4772]: I0320 10:57:42.641165 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:42 crc kubenswrapper[4772]: I0320 10:57:42.641252 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:42 crc kubenswrapper[4772]: I0320 10:57:42.641252 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:42 crc kubenswrapper[4772]: I0320 10:57:42.641165 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.641397 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.641506 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.641614 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.641695 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:42 crc kubenswrapper[4772]: I0320 10:57:42.684253 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:57:42 crc kubenswrapper[4772]: I0320 10:57:42.684421 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:42 crc kubenswrapper[4772]: I0320 10:57:42.684470 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.684531 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.684468181 +0000 UTC m=+212.775434716 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:57:42 crc kubenswrapper[4772]: I0320 10:57:42.684621 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.684627 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.684775 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.684874 4772 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.684776 4772 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.684956 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.684915933 +0000 UTC m=+212.775882428 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.685012 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.684980405 +0000 UTC m=+212.775946920 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.684676 4772 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.685090 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.685074917 +0000 UTC m=+212.776041432 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Mar 20 10:57:42 crc kubenswrapper[4772]: I0320 10:57:42.785340 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.785491 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.785511 4772 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.785524 4772 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:42 crc kubenswrapper[4772]: E0320 10:57:42.785571 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.785555852 +0000 UTC m=+212.876522337 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.641211 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.641246 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.641279 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:44 crc kubenswrapper[4772]: E0320 10:57:44.641322 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.641462 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:44 crc kubenswrapper[4772]: E0320 10:57:44.641476 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:44 crc kubenswrapper[4772]: E0320 10:57:44.641509 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:44 crc kubenswrapper[4772]: E0320 10:57:44.641570 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.654124 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-k4qd4" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f7e1e7e8-3b09-4e05-a31d-a74713a885f3\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d65a866e0b06a91ae54039c3c3a23be9e1116199149a7e4248a6dbe9a59d379f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-qn484\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:38Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-k4qd4\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.677400 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-tmktf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"8e02a490-1cd4-40f2-baeb-f04ce5317e4d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://c072350931f573d203b5ed8630876e8d3d981f7478203a896871cd8b5f416553\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:45Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4de2fe11da12ae2440bc919d30502a8511d8e04decefbf2a983c84d76de3ea6e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c8
57df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://00ab4773152ad03d93bab39fc219acee6cc2e2620202dddb6f32b7ed0e301093\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://b44fb5833837aaebd140c8179eea01e350fa91d450ad0859f3375d56884cd4da\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ebf240945363c3737faffe3087d486d285534027c5df73e185640c5707a4d795\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:42Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/se
crets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28480097d9b89caf1400f46cabb9e172dd301c9e646d4ddae3f95c6acdd22af9\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://615e8c3bf221906829d46057fabd60e436aa45c5f9497b812c0497f452e2ee17\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:44Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:44Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5wfrr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-tmktf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.695554 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9742872d-abdb-4fdc-a4d2-48d04fa61dbf\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2a24a213bea61f3fc0f750f0fa093e490e24a83b30fdcb6ab0efee5f9a356f9c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a806840311bce4944cec10e09ffb5debc3be7e6f20b40c80292ddcf297db2084\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-x4qpq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:51Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-kzxjb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 
10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.712492 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"aa090047-d48b-4b5d-84a0-14908a9bd79c\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:07Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://eb9033b2be08ce32fd65b95fa314492bd2a117445313699e62962f27e5c48356\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://97b875b88a814ccfbc5bf71292c5869f4bd7b7f005383d7ae40a6027987ce57a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://6d33fd0873f0fe72f8f380e8f57d887c6908f3dfe5d207f461855c83796e1b16\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.
126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9d4830c4ca712050e5d5d77e415c0b08f95d356694b48914c735a2c51ccc8956\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9d4830c4ca712050e5d5d77e415c0b08f95d356694b48914c735a2c51ccc8956\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.733273 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d4a1997e-58f5-4755-9a44-7d63be4de00b\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:58Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ad76f823c12a841777c33e9050b031bfdc49600d43524c67d2b54b39a1ae8825\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://38b6b1019097a8fa22e4432221e41387c82c88eb58f8012e316f9b1a2b738957\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:55:41Z\\\",\\\"message\\\":\\\"+ timeout 3m /bin/bash -exuo pipefail -c 'while [ -n \\\\\\\"$(ss -Htanop \\\\\\\\( sport = 10357 \\\\\\\\))\\\\\\\" ]; do sleep 1; done'\\\\n++ ss -Htanop '(' sport = 10357 ')'\\\\n+ '[' -n '' ']'\\\\n+ exec cluster-policy-controller start --config=/etc/kubernetes/static-pod-resources/configmaps/cluster-policy-controller-config/config.yaml 
--kubeconfig=/etc/kubernetes/static-pod-resources/configmaps/controller-manager-kubeconfig/kubeconfig --namespace=openshift-kube-controller-manager -v=2\\\\nI0320 10:55:16.963663 1 leaderelection.go:121] The leader election gives 4 retries and allows for 30s of clock skew. The kube-apiserver downtime tolerance is 78s. Worst non-graceful lease acquisition is 2m43s. Worst graceful lease acquisition is {26s}.\\\\nI0320 10:55:16.966203 1 observer_polling.go:159] Starting file observer\\\\nI0320 10:55:17.025013 1 builder.go:298] cluster-policy-controller version 4.18.0-202501230001.p0.g5fd8525.assembly.stream.el9-5fd8525-5fd852525909ce6eab52972ba9ce8fcf56528eb9\\\\nI0320 10:55:17.031237 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.crt::/etc/kubernetes/static-pod-resources/secrets/serving-cert/tls.key\\\\\\\"\\\\nI0320 10:55:41.926937 1 cmd.go:138] Received SIGTERM or SIGINT signal, shutting down controller.\\\\nF0320 10:55:41.927086 1 cmd.go:179] failed checking apiserver connectivity: Get \\\\\\\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/openshift-kube-controller-manager/leases/cluster-policy-controller-lock\\\\\\\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:55:41Z is after 2026-02-23T05:33:13Z\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://fa8330ebb1d9f35b3265890086718ee7a34b8129ab2a14536e5eaf679ccf417a\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-
pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://b55dc0780a7d7d7ce5e51f53e62250dc1b32daf50598a8a7858d4bd58857affd\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.752202 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.771318 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://e469b15f3eccbdcb2264d52ae17162ba3c7a87e9e64c936256c698269e80ce37\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: E0320 10:57:44.780344 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.788405 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:43Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://732f3b7932607184b864bb5293c74ceec920089196dcee92e72c94ab11404c7f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:42Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.812934 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.830428 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7b18d454e45d8a68550da0d67465a60ed627a556c51985002bb897ca2da2119f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-d5pfq\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-ltsw5\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.841305 4772 status_manager.go:875] "Failed to 
update status for pod" pod="openshift-image-registry/node-ca-95tl8" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9de5f9ae-372d-4c5f-89ec-93a96431485b\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:45Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:46Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://8c29847af5fe7fa60fda6e39378998810cb1cabc59058b71015cc31cf62262af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:46Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-bvfpf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:45Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-95tl8\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.854382 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"69ce1b19-6ab4-4f21-bf6b-ffe4eca38794\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:19Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-opera
tor@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5\\\",\\\"exitCode\\\":255,\\\"finishedAt\\\":\\\"2026-03-20T10:56:14Z\\\",\\\"message\\\":\\\"le observer\\\\nW0320 10:56:14.345751 1 builder.go:272] unable to get owner reference (falling back to namespace): pods \\\\\\\"kube-apiserver-crc\\\\\\\" not found\\\\nI0320 10:56:14.345983 1 builder.go:304] check-endpoints version 4.18.0-202502101302.p0.g763313c.assembly.stream.el9-763313c-763313c860ea43fcfc9b1ac00ebae096b57c078e\\\\nI0320 10:56:14.347051 1 dynamic_serving_content.go:116] \\\\\\\"Loaded a new cert/key pair\\\\\\\" name=\\\\\\\"serving-cert::/tmp/serving-cert-3147707038/tls.crt::/tmp/serving-cert-3147707038/tls.key\\\\\\\"\\\\nI0320 10:56:14.798421 1 requestheader_controller.go:247] Loaded a new request header values for RequestHeaderAuthRequestController\\\\nI0320 10:56:14.801037 1 maxinflight.go:139] \\\\\\\"Initialized nonMutatingChan\\\\\\\" len=400\\\\nI0320 10:56:14.801053 1 maxinflight.go:145] \\\\\\\"Initialized mutatingChan\\\\\\\" len=200\\\\nI0320 10:56:14.801068 1 maxinflight.go:116] \\\\\\\"Set denominator for readonly requests\\\\\\\" limit=400\\\\nI0320 10:56:14.801073 1 maxinflight.go:120] \\\\\\\"Set denominator for mutating requests\\\\\\\" limit=200\\\\nI0320 10:56:14.808876 1 secure_serving.go:57] Forcing use of http/1.1 only\\\\nW0320 10:56:14.808895 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808901 1 secure_serving.go:69] Use of insecure cipher 'TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256' detected.\\\\nW0320 10:56:14.808905 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_GCM_SHA256' detected.\\\\nW0320 10:56:14.808910 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_GCM_SHA384' detected.\\\\nW0320 10:56:14.808912 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_128_CBC_SHA' detected.\\\\nW0320 10:56:14.808915 1 secure_serving.go:69] Use of insecure cipher 'TLS_RSA_WITH_AES_256_CBC_SHA' detected.\\\\nI0320 10:56:14.808953 1 genericapiserver.go:533] MuxAndDiscoveryComplete has all endpoints registered and discovery information is complete\\\\nF0320 10:56:14.811060 1 cmd.go:182] pods \\\\\\\"kube-apiserver-crc\\\\\\\" not 
found\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:13Z\\\"}},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":4,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.883941 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d62da04c-5422-4320-9352-8959b89501be\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:41Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\
"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:41Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:38Z\\\",\\\"message\\\":\\\"0320 10:57:38.928756 7303 handler.go:190] Sending *v1.Node event handler 7 for removal\\\\nI0320 10:57:38.928795 7303 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0320 10:57:38.928823 7303 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0320 10:57:38.928829 7303 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0320 10:57:38.928864 7303 handler.go:190] Sending *v1.NetworkPolicy event handler 4 for removal\\\\nI0320 10:57:38.928864 7303 handler.go:208] Removed *v1.Node event handler 2\\\\nI0320 10:57:38.928882 7303 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0320 10:57:38.928888 7303 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0320 10:57:38.928899 7303 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0320 10:57:38.928904 7303 factory.go:656] Stopping watch factory\\\\nI0320 10:57:38.928921 7303 ovnkube.go:599] Stopped ovnkube\\\\nI0320 10:57:38.928917 7303 handler.go:208] Removed *v1.Node event handler 7\\\\nI0320 10:57:38.928921 7303 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0320 10:57:38.928940 7303 handler.go:208] Removed *v1.Namespace event handler 1\\\\nI0320 10:57:38.928895 7303 handler.go:208] Removed *v1.EgressIP event handler 8\\\\nI0320 10:57:38.928943 7303 handler.go:208] Removed *v1.NetworkPolicy event handler 4\\\\nI0320 10\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:57:38Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"image\\\":\\\"quay.io/openshift-release-
dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:40Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-js95g\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-z8p9x\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.900812 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-7fpq9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:39Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:57:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-03-20T10:57:26Z\\\",\\\"message\\\":\\\"2026-03-20T10:56:40+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc\\\\n2026-03-20T10:56:40+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_142b0085-e6bd-4ac2-8b62-91a4ef66d4cc to 
/host/opt/cni/bin/\\\\n2026-03-20T10:56:41Z [verbose] multus-daemon started\\\\n2026-03-20T10:56:41Z [verbose] Readiness Indicator file check\\\\n2026-03-20T10:57:26Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:57:26Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-xmwgg\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:39Z\\\"}}\" for pod \"openshift-multus\"/\"multus-7fpq9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.911796 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2ac5550b-02eb-48b4-b62a-e21dd4429249\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:52Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-7srzh\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:56:52Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-m8kjd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.934292 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d1339b92-59e9-415d-8932-f7b1063418cc\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:18Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:30Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-03-20T10:55:14Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://6219b654aaac118078bebe1ffd61d6b794f63b5260ecc2d14ce77c0a0322680a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://5c1eb176d3c6fb87cca8631a6eb2aeacbf8406cd378877513ef3eab6949ef32b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9f73859e18a17960d8af9f6025ba1951c773e6cab063bdde5e108727c0e678f4\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://1cb6f1826a8adc5254fbe3c0e742fce2e65d7f5
71cfaa009685420c62be5d4a9\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:20Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ff34e76ecb7e91ea3e06848f0175c3677a457618b6fc4185292361e180ce9ddb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:55:18Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://fc51403a0bc546faa1c415e742a72f560a2f484b58162f382414b8802c6edc42\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:15Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:15Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://28dd4e1b59b2b7604e007469e12b69764e0acfb2ab500a3ad9d0853f6ff15bea\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:16Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:16Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://cb9383801818c2b5b7e27bd7e0ce3e1490a717e30c47026ab2ed13955b51ef53\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-03-20T10:55:17Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-03-20T10:55:17Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-03-20T10:55:14Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.976157 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:38Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:44 crc kubenswrapper[4772]: I0320 10:57:44.997888 4772 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-03-20T10:56:40Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2276317fc2e2133007482836d3d7a730a44ae38fbff64315f37f8ff6d66c4213\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4337df50218d3e7c21efab2563f4960382fd005e368fe86d453405cb2dda72c7\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-03-20T10:56:39Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mount
Path\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-03-20T10:57:44Z is after 2025-08-24T17:21:41Z" Mar 20 10:57:46 crc kubenswrapper[4772]: I0320 10:57:46.641006 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:46 crc kubenswrapper[4772]: I0320 10:57:46.641463 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:46 crc kubenswrapper[4772]: I0320 10:57:46.641347 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:46 crc kubenswrapper[4772]: I0320 10:57:46.641091 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:46 crc kubenswrapper[4772]: E0320 10:57:46.641636 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:46 crc kubenswrapper[4772]: E0320 10:57:46.641811 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:46 crc kubenswrapper[4772]: E0320 10:57:46.642021 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:46 crc kubenswrapper[4772]: E0320 10:57:46.642099 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.748893 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.748977 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.749000 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.749030 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.749061 4772 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-03-20T10:57:47Z","lastTransitionTime":"2026-03-20T10:57:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.828060 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb"] Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.830542 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.835542 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.835753 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.836055 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.836575 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.924134 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podStartSLOduration=110.924104852 podStartE2EDuration="1m50.924104852s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:47.905459219 +0000 UTC m=+153.996425754" watchObservedRunningTime="2026-03-20 10:57:47.924104852 +0000 UTC m=+154.015071377" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.940390 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/257cc5c3-16df-43f0-acb5-b8716044c771-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.940448 
4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257cc5c3-16df-43f0-acb5-b8716044c771-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.940478 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/257cc5c3-16df-43f0-acb5-b8716044c771-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.940494 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/257cc5c3-16df-43f0-acb5-b8716044c771-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.940535 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/257cc5c3-16df-43f0-acb5-b8716044c771-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.946374 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-95tl8" podStartSLOduration=110.946360985 podStartE2EDuration="1m50.946360985s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:47.9244018 +0000 UTC m=+154.015368315" watchObservedRunningTime="2026-03-20 10:57:47.946360985 +0000 UTC m=+154.037327470" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.966199 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=56.966183891 podStartE2EDuration="56.966183891s" podCreationTimestamp="2026-03-20 10:56:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:47.965996525 +0000 UTC m=+154.056963050" watchObservedRunningTime="2026-03-20 10:57:47.966183891 +0000 UTC m=+154.057150376" Mar 20 10:57:47 crc kubenswrapper[4772]: I0320 10:57:47.993154 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.005413 4772 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.022411 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=65.022387705 podStartE2EDuration="1m5.022387705s" podCreationTimestamp="2026-03-20 10:56:43 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:48.021732567 +0000 UTC m=+154.112699102" watchObservedRunningTime="2026-03-20 10:57:48.022387705 +0000 UTC m=+154.113354190" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.043263 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/257cc5c3-16df-43f0-acb5-b8716044c771-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.043356 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257cc5c3-16df-43f0-acb5-b8716044c771-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.043408 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/257cc5c3-16df-43f0-acb5-b8716044c771-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.043428 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/257cc5c3-16df-43f0-acb5-b8716044c771-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.043445 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/257cc5c3-16df-43f0-acb5-b8716044c771-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.043513 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/257cc5c3-16df-43f0-acb5-b8716044c771-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.043644 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/257cc5c3-16df-43f0-acb5-b8716044c771-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.044669 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/257cc5c3-16df-43f0-acb5-b8716044c771-service-ca\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: 
\"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.052653 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/257cc5c3-16df-43f0-acb5-b8716044c771-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.066302 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/257cc5c3-16df-43f0-acb5-b8716044c771-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-qn5wb\" (UID: \"257cc5c3-16df-43f0-acb5-b8716044c771\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.077056 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-7fpq9" podStartSLOduration=111.077033495 podStartE2EDuration="1m51.077033495s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:48.076544042 +0000 UTC m=+154.167510537" watchObservedRunningTime="2026-03-20 10:57:48.077033495 +0000 UTC m=+154.167999980" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.104298 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=12.104270029 podStartE2EDuration="12.104270029s" podCreationTimestamp="2026-03-20 10:57:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:48.102687575 +0000 UTC m=+154.193654060" watchObservedRunningTime="2026-03-20 10:57:48.104270029 +0000 UTC m=+154.195236524" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.118208 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=23.118184978 podStartE2EDuration="23.118184978s" podCreationTimestamp="2026-03-20 10:57:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:48.116932084 +0000 UTC m=+154.207898569" watchObservedRunningTime="2026-03-20 10:57:48.118184978 +0000 UTC m=+154.209151463" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.153778 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.160644 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-k4qd4" podStartSLOduration=111.160619527 podStartE2EDuration="1m51.160619527s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:48.153440876 +0000 UTC m=+154.244407401" watchObservedRunningTime="2026-03-20 10:57:48.160619527 +0000 UTC m=+154.251586022" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.178958 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-tmktf" podStartSLOduration=111.17893405 podStartE2EDuration="1m51.17893405s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:48.177585992 +0000 UTC m=+154.268552487" watchObservedRunningTime="2026-03-20 10:57:48.17893405 +0000 UTC m=+154.269900535" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.197400 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-kzxjb" podStartSLOduration=110.197380267 podStartE2EDuration="1m50.197380267s" podCreationTimestamp="2026-03-20 10:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:48.195874965 +0000 UTC m=+154.286841470" watchObservedRunningTime="2026-03-20 10:57:48.197380267 +0000 UTC m=+154.288346752" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.492693 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" event={"ID":"257cc5c3-16df-43f0-acb5-b8716044c771","Type":"ContainerStarted","Data":"8c28cc157127b40d1a1ec183f047509b0fab890238746df9f0e6ce4d80bea18d"} Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.492777 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" event={"ID":"257cc5c3-16df-43f0-acb5-b8716044c771","Type":"ContainerStarted","Data":"a321f09eba144c014bf1ce70816da3eade6f18fb12589f5a12b7c2a42796aa59"} Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.523792 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-qn5wb" podStartSLOduration=111.523768631 podStartE2EDuration="1m51.523768631s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:57:48.522878806 +0000 UTC m=+154.613845341" watchObservedRunningTime="2026-03-20 10:57:48.523768631 +0000 UTC m=+154.614735126" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.641587 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.641650 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:48 crc kubenswrapper[4772]: E0320 10:57:48.641719 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.641650 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:48 crc kubenswrapper[4772]: E0320 10:57:48.641888 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:48 crc kubenswrapper[4772]: E0320 10:57:48.641935 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:48 crc kubenswrapper[4772]: I0320 10:57:48.642218 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:48 crc kubenswrapper[4772]: E0320 10:57:48.642587 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:49 crc kubenswrapper[4772]: E0320 10:57:49.781390 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:50 crc kubenswrapper[4772]: I0320 10:57:50.641194 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:50 crc kubenswrapper[4772]: I0320 10:57:50.641246 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:50 crc kubenswrapper[4772]: E0320 10:57:50.641374 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:50 crc kubenswrapper[4772]: I0320 10:57:50.641390 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:50 crc kubenswrapper[4772]: I0320 10:57:50.641209 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:50 crc kubenswrapper[4772]: E0320 10:57:50.641533 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:50 crc kubenswrapper[4772]: E0320 10:57:50.641659 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:50 crc kubenswrapper[4772]: E0320 10:57:50.641776 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:52 crc kubenswrapper[4772]: I0320 10:57:52.641331 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:52 crc kubenswrapper[4772]: I0320 10:57:52.641340 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:52 crc kubenswrapper[4772]: I0320 10:57:52.641340 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:52 crc kubenswrapper[4772]: E0320 10:57:52.641536 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:52 crc kubenswrapper[4772]: E0320 10:57:52.641632 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:52 crc kubenswrapper[4772]: I0320 10:57:52.641703 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:52 crc kubenswrapper[4772]: E0320 10:57:52.641968 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:52 crc kubenswrapper[4772]: E0320 10:57:52.641709 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:53 crc kubenswrapper[4772]: I0320 10:57:53.642127 4772 scope.go:117] "RemoveContainer" containerID="12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4" Mar 20 10:57:53 crc kubenswrapper[4772]: E0320 10:57:53.642394 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" Mar 20 10:57:54 crc kubenswrapper[4772]: I0320 10:57:54.641916 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:54 crc kubenswrapper[4772]: I0320 10:57:54.642018 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:54 crc kubenswrapper[4772]: I0320 10:57:54.641919 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:54 crc kubenswrapper[4772]: E0320 10:57:54.644040 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:54 crc kubenswrapper[4772]: I0320 10:57:54.644132 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:54 crc kubenswrapper[4772]: E0320 10:57:54.644336 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:54 crc kubenswrapper[4772]: E0320 10:57:54.644498 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:54 crc kubenswrapper[4772]: E0320 10:57:54.644624 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:54 crc kubenswrapper[4772]: E0320 10:57:54.782187 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:57:56 crc kubenswrapper[4772]: I0320 10:57:56.175084 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:56 crc kubenswrapper[4772]: E0320 10:57:56.175283 4772 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:56 crc kubenswrapper[4772]: E0320 10:57:56.175754 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs podName:2ac5550b-02eb-48b4-b62a-e21dd4429249 nodeName:}" failed. No retries permitted until 2026-03-20 10:59:00.17572067 +0000 UTC m=+226.266687195 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs") pod "network-metrics-daemon-m8kjd" (UID: "2ac5550b-02eb-48b4-b62a-e21dd4429249") : object "openshift-multus"/"metrics-daemon-secret" not registered Mar 20 10:57:56 crc kubenswrapper[4772]: I0320 10:57:56.641278 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:56 crc kubenswrapper[4772]: I0320 10:57:56.641332 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:56 crc kubenswrapper[4772]: E0320 10:57:56.641484 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:56 crc kubenswrapper[4772]: I0320 10:57:56.641617 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:56 crc kubenswrapper[4772]: I0320 10:57:56.641731 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:56 crc kubenswrapper[4772]: E0320 10:57:56.641915 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:56 crc kubenswrapper[4772]: E0320 10:57:56.642194 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:56 crc kubenswrapper[4772]: E0320 10:57:56.642541 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:56 crc kubenswrapper[4772]: I0320 10:57:56.667864 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Mar 20 10:57:58 crc kubenswrapper[4772]: I0320 10:57:58.641923 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:57:58 crc kubenswrapper[4772]: I0320 10:57:58.642010 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:57:58 crc kubenswrapper[4772]: E0320 10:57:58.642160 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:57:58 crc kubenswrapper[4772]: I0320 10:57:58.642185 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:57:58 crc kubenswrapper[4772]: I0320 10:57:58.642284 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:57:58 crc kubenswrapper[4772]: E0320 10:57:58.642406 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:57:58 crc kubenswrapper[4772]: E0320 10:57:58.642564 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:57:58 crc kubenswrapper[4772]: E0320 10:57:58.642755 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:57:59 crc kubenswrapper[4772]: E0320 10:57:59.783368 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:00 crc kubenswrapper[4772]: I0320 10:58:00.641918 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:00 crc kubenswrapper[4772]: I0320 10:58:00.642015 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:00 crc kubenswrapper[4772]: E0320 10:58:00.642047 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:00 crc kubenswrapper[4772]: E0320 10:58:00.642196 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:00 crc kubenswrapper[4772]: I0320 10:58:00.642228 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:00 crc kubenswrapper[4772]: E0320 10:58:00.642324 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:00 crc kubenswrapper[4772]: I0320 10:58:00.642608 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:00 crc kubenswrapper[4772]: E0320 10:58:00.642824 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:02 crc kubenswrapper[4772]: I0320 10:58:02.641285 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:02 crc kubenswrapper[4772]: I0320 10:58:02.641352 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:02 crc kubenswrapper[4772]: E0320 10:58:02.641513 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:02 crc kubenswrapper[4772]: E0320 10:58:02.641686 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:02 crc kubenswrapper[4772]: I0320 10:58:02.642012 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:02 crc kubenswrapper[4772]: I0320 10:58:02.642057 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:02 crc kubenswrapper[4772]: E0320 10:58:02.642179 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:02 crc kubenswrapper[4772]: E0320 10:58:02.642321 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:04 crc kubenswrapper[4772]: I0320 10:58:04.640937 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:04 crc kubenswrapper[4772]: I0320 10:58:04.641002 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:04 crc kubenswrapper[4772]: I0320 10:58:04.641077 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:04 crc kubenswrapper[4772]: E0320 10:58:04.643602 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:04 crc kubenswrapper[4772]: I0320 10:58:04.643664 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:04 crc kubenswrapper[4772]: E0320 10:58:04.643934 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:04 crc kubenswrapper[4772]: E0320 10:58:04.644085 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:04 crc kubenswrapper[4772]: E0320 10:58:04.644190 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:04 crc kubenswrapper[4772]: I0320 10:58:04.665818 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=8.665787348 podStartE2EDuration="8.665787348s" podCreationTimestamp="2026-03-20 10:57:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:04.663584626 +0000 UTC m=+170.754551181" watchObservedRunningTime="2026-03-20 10:58:04.665787348 +0000 UTC m=+170.756753883" Mar 20 10:58:04 crc kubenswrapper[4772]: E0320 10:58:04.784628 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:05 crc kubenswrapper[4772]: I0320 10:58:05.642319 4772 scope.go:117] "RemoveContainer" containerID="12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4" Mar 20 10:58:05 crc kubenswrapper[4772]: E0320 10:58:05.642789 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" Mar 20 10:58:06 crc kubenswrapper[4772]: I0320 10:58:06.641552 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:06 crc kubenswrapper[4772]: I0320 10:58:06.641603 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:06 crc kubenswrapper[4772]: I0320 10:58:06.641812 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:06 crc kubenswrapper[4772]: E0320 10:58:06.642028 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:06 crc kubenswrapper[4772]: I0320 10:58:06.642049 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:06 crc kubenswrapper[4772]: E0320 10:58:06.642211 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:06 crc kubenswrapper[4772]: E0320 10:58:06.642343 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:06 crc kubenswrapper[4772]: E0320 10:58:06.642436 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:08 crc kubenswrapper[4772]: I0320 10:58:08.641291 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:08 crc kubenswrapper[4772]: I0320 10:58:08.641428 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:08 crc kubenswrapper[4772]: I0320 10:58:08.641421 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:08 crc kubenswrapper[4772]: I0320 10:58:08.641422 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:08 crc kubenswrapper[4772]: E0320 10:58:08.641556 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:08 crc kubenswrapper[4772]: E0320 10:58:08.641812 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:08 crc kubenswrapper[4772]: E0320 10:58:08.641924 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:08 crc kubenswrapper[4772]: E0320 10:58:08.642019 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:09 crc kubenswrapper[4772]: E0320 10:58:09.786099 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:10 crc kubenswrapper[4772]: I0320 10:58:10.641432 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:10 crc kubenswrapper[4772]: I0320 10:58:10.641518 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:10 crc kubenswrapper[4772]: I0320 10:58:10.641454 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:10 crc kubenswrapper[4772]: I0320 10:58:10.641617 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:10 crc kubenswrapper[4772]: E0320 10:58:10.641834 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:10 crc kubenswrapper[4772]: E0320 10:58:10.642072 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:10 crc kubenswrapper[4772]: E0320 10:58:10.642269 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:10 crc kubenswrapper[4772]: E0320 10:58:10.642451 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:12 crc kubenswrapper[4772]: I0320 10:58:12.587817 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7fpq9_a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d/kube-multus/1.log" Mar 20 10:58:12 crc kubenswrapper[4772]: I0320 10:58:12.589089 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7fpq9_a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d/kube-multus/0.log" Mar 20 10:58:12 crc kubenswrapper[4772]: I0320 10:58:12.589146 4772 generic.go:334] "Generic (PLEG): container finished" podID="a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d" containerID="b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003" exitCode=1 Mar 20 10:58:12 crc kubenswrapper[4772]: I0320 10:58:12.589191 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7fpq9" event={"ID":"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d","Type":"ContainerDied","Data":"b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003"} Mar 20 10:58:12 crc kubenswrapper[4772]: I0320 10:58:12.589264 4772 scope.go:117] "RemoveContainer" containerID="de94894a0aebfcbc018b8a201edb2fd4cad2555a67f3b7fd8cee054ecc136a66" Mar 20 10:58:12 crc kubenswrapper[4772]: I0320 10:58:12.589874 4772 scope.go:117] "RemoveContainer" containerID="b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003" Mar 20 10:58:12 crc kubenswrapper[4772]: E0320 10:58:12.590539 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-7fpq9_openshift-multus(a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d)\"" pod="openshift-multus/multus-7fpq9" podUID="a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d" Mar 20 10:58:12 crc kubenswrapper[4772]: I0320 10:58:12.641382 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:12 crc kubenswrapper[4772]: I0320 10:58:12.641456 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:12 crc kubenswrapper[4772]: I0320 10:58:12.641522 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:12 crc kubenswrapper[4772]: E0320 10:58:12.641577 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:12 crc kubenswrapper[4772]: I0320 10:58:12.641403 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:12 crc kubenswrapper[4772]: E0320 10:58:12.641786 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:12 crc kubenswrapper[4772]: E0320 10:58:12.641958 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:12 crc kubenswrapper[4772]: E0320 10:58:12.642063 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:13 crc kubenswrapper[4772]: I0320 10:58:13.595319 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7fpq9_a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d/kube-multus/1.log" Mar 20 10:58:14 crc kubenswrapper[4772]: I0320 10:58:14.641167 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:14 crc kubenswrapper[4772]: I0320 10:58:14.641222 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:14 crc kubenswrapper[4772]: I0320 10:58:14.641251 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:14 crc kubenswrapper[4772]: E0320 10:58:14.646035 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:14 crc kubenswrapper[4772]: I0320 10:58:14.646469 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:14 crc kubenswrapper[4772]: E0320 10:58:14.646690 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:14 crc kubenswrapper[4772]: E0320 10:58:14.647159 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:14 crc kubenswrapper[4772]: E0320 10:58:14.647578 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:14 crc kubenswrapper[4772]: E0320 10:58:14.787275 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:16 crc kubenswrapper[4772]: I0320 10:58:16.641424 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:16 crc kubenswrapper[4772]: E0320 10:58:16.641636 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:16 crc kubenswrapper[4772]: I0320 10:58:16.641664 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:16 crc kubenswrapper[4772]: I0320 10:58:16.641439 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:16 crc kubenswrapper[4772]: I0320 10:58:16.641433 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:16 crc kubenswrapper[4772]: E0320 10:58:16.642242 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:16 crc kubenswrapper[4772]: E0320 10:58:16.642674 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:16 crc kubenswrapper[4772]: I0320 10:58:16.642782 4772 scope.go:117] "RemoveContainer" containerID="12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4" Mar 20 10:58:16 crc kubenswrapper[4772]: E0320 10:58:16.643039 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:16 crc kubenswrapper[4772]: E0320 10:58:16.643056 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-z8p9x_openshift-ovn-kubernetes(d62da04c-5422-4320-9352-8959b89501be)\"" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" Mar 20 10:58:18 crc kubenswrapper[4772]: I0320 10:58:18.642095 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:18 crc kubenswrapper[4772]: E0320 10:58:18.642340 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:18 crc kubenswrapper[4772]: I0320 10:58:18.642830 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:18 crc kubenswrapper[4772]: E0320 10:58:18.643037 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:18 crc kubenswrapper[4772]: I0320 10:58:18.643370 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:18 crc kubenswrapper[4772]: I0320 10:58:18.643383 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:18 crc kubenswrapper[4772]: E0320 10:58:18.643540 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:18 crc kubenswrapper[4772]: E0320 10:58:18.643651 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:19 crc kubenswrapper[4772]: E0320 10:58:19.788808 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:20 crc kubenswrapper[4772]: I0320 10:58:20.641900 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:20 crc kubenswrapper[4772]: I0320 10:58:20.641954 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:20 crc kubenswrapper[4772]: I0320 10:58:20.642040 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:20 crc kubenswrapper[4772]: I0320 10:58:20.641900 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:20 crc kubenswrapper[4772]: E0320 10:58:20.642123 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:20 crc kubenswrapper[4772]: E0320 10:58:20.642248 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:20 crc kubenswrapper[4772]: E0320 10:58:20.642340 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:20 crc kubenswrapper[4772]: E0320 10:58:20.642460 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:22 crc kubenswrapper[4772]: I0320 10:58:22.641298 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:22 crc kubenswrapper[4772]: I0320 10:58:22.641436 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:22 crc kubenswrapper[4772]: I0320 10:58:22.641435 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:22 crc kubenswrapper[4772]: I0320 10:58:22.641355 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:22 crc kubenswrapper[4772]: E0320 10:58:22.641579 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:22 crc kubenswrapper[4772]: E0320 10:58:22.641720 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:22 crc kubenswrapper[4772]: E0320 10:58:22.641828 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:22 crc kubenswrapper[4772]: E0320 10:58:22.641927 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:23 crc kubenswrapper[4772]: I0320 10:58:23.642149 4772 scope.go:117] "RemoveContainer" containerID="b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003" Mar 20 10:58:24 crc kubenswrapper[4772]: I0320 10:58:24.639400 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7fpq9_a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d/kube-multus/1.log" Mar 20 10:58:24 crc kubenswrapper[4772]: I0320 10:58:24.639867 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7fpq9" event={"ID":"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d","Type":"ContainerStarted","Data":"d695a81a4a5904d8b259c89eb9d85c68d7f8c623f84f0a020356bde5e5f9cbc3"} Mar 20 10:58:24 crc kubenswrapper[4772]: I0320 10:58:24.641674 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:24 crc kubenswrapper[4772]: I0320 10:58:24.641736 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:24 crc kubenswrapper[4772]: E0320 10:58:24.641942 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:24 crc kubenswrapper[4772]: I0320 10:58:24.642037 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:24 crc kubenswrapper[4772]: E0320 10:58:24.642211 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:24 crc kubenswrapper[4772]: E0320 10:58:24.642349 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:24 crc kubenswrapper[4772]: I0320 10:58:24.642522 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:24 crc kubenswrapper[4772]: E0320 10:58:24.642700 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:24 crc kubenswrapper[4772]: E0320 10:58:24.790128 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:26 crc kubenswrapper[4772]: I0320 10:58:26.641826 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:26 crc kubenswrapper[4772]: I0320 10:58:26.641930 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:26 crc kubenswrapper[4772]: I0320 10:58:26.642021 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:26 crc kubenswrapper[4772]: I0320 10:58:26.642115 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:26 crc kubenswrapper[4772]: E0320 10:58:26.642099 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:26 crc kubenswrapper[4772]: E0320 10:58:26.642267 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:26 crc kubenswrapper[4772]: E0320 10:58:26.642364 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:26 crc kubenswrapper[4772]: E0320 10:58:26.642522 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:28 crc kubenswrapper[4772]: I0320 10:58:28.642005 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:28 crc kubenswrapper[4772]: I0320 10:58:28.642457 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:28 crc kubenswrapper[4772]: I0320 10:58:28.642461 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:28 crc kubenswrapper[4772]: E0320 10:58:28.642645 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:28 crc kubenswrapper[4772]: I0320 10:58:28.642658 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:28 crc kubenswrapper[4772]: E0320 10:58:28.642765 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:28 crc kubenswrapper[4772]: E0320 10:58:28.642967 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:28 crc kubenswrapper[4772]: E0320 10:58:28.643031 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:29 crc kubenswrapper[4772]: E0320 10:58:29.792412 4772 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Mar 20 10:58:30 crc kubenswrapper[4772]: I0320 10:58:30.641608 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:30 crc kubenswrapper[4772]: I0320 10:58:30.641682 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:30 crc kubenswrapper[4772]: I0320 10:58:30.641650 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:30 crc kubenswrapper[4772]: I0320 10:58:30.641700 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:30 crc kubenswrapper[4772]: E0320 10:58:30.641963 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:30 crc kubenswrapper[4772]: E0320 10:58:30.642124 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:30 crc kubenswrapper[4772]: E0320 10:58:30.642416 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:30 crc kubenswrapper[4772]: E0320 10:58:30.642512 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:31 crc kubenswrapper[4772]: I0320 10:58:31.641697 4772 scope.go:117] "RemoveContainer" containerID="12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4" Mar 20 10:58:32 crc kubenswrapper[4772]: I0320 10:58:32.428442 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m8kjd"] Mar 20 10:58:32 crc kubenswrapper[4772]: I0320 10:58:32.428892 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:32 crc kubenswrapper[4772]: E0320 10:58:32.428976 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:32 crc kubenswrapper[4772]: I0320 10:58:32.641017 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:32 crc kubenswrapper[4772]: I0320 10:58:32.641017 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:32 crc kubenswrapper[4772]: I0320 10:58:32.641134 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:32 crc kubenswrapper[4772]: E0320 10:58:32.641262 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:32 crc kubenswrapper[4772]: E0320 10:58:32.641463 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:32 crc kubenswrapper[4772]: E0320 10:58:32.641518 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:32 crc kubenswrapper[4772]: I0320 10:58:32.669075 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/3.log" Mar 20 10:58:32 crc kubenswrapper[4772]: I0320 10:58:32.671085 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerStarted","Data":"993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c"} Mar 20 10:58:32 crc kubenswrapper[4772]: I0320 10:58:32.671473 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:58:32 crc kubenswrapper[4772]: I0320 10:58:32.695310 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podStartSLOduration=155.695291569 podStartE2EDuration="2m35.695291569s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:32.694595099 +0000 UTC m=+198.785561594" watchObservedRunningTime="2026-03-20 10:58:32.695291569 +0000 UTC m=+198.786258044" Mar 20 10:58:34 crc kubenswrapper[4772]: I0320 10:58:34.641129 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:34 crc kubenswrapper[4772]: E0320 10:58:34.642978 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Mar 20 10:58:34 crc kubenswrapper[4772]: I0320 10:58:34.643009 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:34 crc kubenswrapper[4772]: I0320 10:58:34.643139 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:34 crc kubenswrapper[4772]: E0320 10:58:34.643237 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-m8kjd" podUID="2ac5550b-02eb-48b4-b62a-e21dd4429249" Mar 20 10:58:34 crc kubenswrapper[4772]: E0320 10:58:34.643384 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Mar 20 10:58:34 crc kubenswrapper[4772]: I0320 10:58:34.643584 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:34 crc kubenswrapper[4772]: E0320 10:58:34.643999 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Mar 20 10:58:36 crc kubenswrapper[4772]: I0320 10:58:36.641320 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:58:36 crc kubenswrapper[4772]: I0320 10:58:36.641478 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:36 crc kubenswrapper[4772]: I0320 10:58:36.642117 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:36 crc kubenswrapper[4772]: I0320 10:58:36.642400 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:36 crc kubenswrapper[4772]: I0320 10:58:36.643953 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 10:58:36 crc kubenswrapper[4772]: I0320 10:58:36.645644 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 10:58:36 crc kubenswrapper[4772]: I0320 10:58:36.646046 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 10:58:36 crc kubenswrapper[4772]: I0320 10:58:36.646620 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 10:58:36 crc kubenswrapper[4772]: I0320 10:58:36.647418 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 10:58:36 crc kubenswrapper[4772]: I0320 10:58:36.649741 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.216560 4772 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.309629 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-drz9m"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.311093 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.312988 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.313635 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.316128 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rq497"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.316778 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.319723 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.320430 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.320996 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.321104 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.321302 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.321624 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.324942 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.325579 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.326519 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s7p9n"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.327165 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.327341 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.328938 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.329743 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.337747 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.338036 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.338375 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lvstj"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.339064 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.339499 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.340695 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.342192 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.342593 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.342819 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.343239 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.343432 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.343948 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.346771 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.347097 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.347354 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.347577 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.347658 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.347898 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.348048 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.348182 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.348294 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.348368 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.348500 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 
10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.348568 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.348698 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.348820 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.348925 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.349236 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-fgwgm"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.353533 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-lg2z9"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.353954 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.354261 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.355235 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.363821 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.363935 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.364076 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.364923 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.365158 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.365273 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.365373 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.365516 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.365657 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.365865 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.366060 4772 reflector.go:368] 
Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.366426 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.366572 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lg2z9" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.368243 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5r5fg"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.368968 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.378125 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mktq6"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.379339 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.379328 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.379920 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.380710 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.383780 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.383864 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.383929 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.383963 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.384048 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.385898 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.408046 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.410030 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.410771 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m6mzp"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.411344 4772 kubelet.go:2421] "SyncLoop 
ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xd664"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.411413 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.411718 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.411952 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.412046 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.412195 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.412553 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.412687 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.412960 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.413201 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.413473 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414174 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-etcd-ca\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414210 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec8f3e81-ffd3-493f-93e4-d00e371c923c-serving-cert\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414241 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c330bb63-a0e1-4650-b3f9-71e0bf85da61-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8ftlb\" (UID: \"c330bb63-a0e1-4650-b3f9-71e0bf85da61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414264 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-config\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414285 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec8f3e81-ffd3-493f-93e4-d00e371c923c-service-ca-bundle\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414305 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc1666f-8e18-4c98-8424-01fe5598d275-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dtdpj\" (UID: \"4bc1666f-8e18-4c98-8424-01fe5598d275\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414327 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-etcd-client\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414360 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-serving-cert\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414381 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-etcd-service-ca\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414402 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bc1666f-8e18-4c98-8424-01fe5598d275-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dtdpj\" (UID: \"4bc1666f-8e18-4c98-8424-01fe5598d275\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414427 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414559 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec8f3e81-ffd3-493f-93e4-d00e371c923c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414583 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbae76fd-5602-4144-907a-5aeaafbedee7-config\") pod \"kube-controller-manager-operator-78b949d7b-5fk2x\" (UID: \"cbae76fd-5602-4144-907a-5aeaafbedee7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414604 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbae76fd-5602-4144-907a-5aeaafbedee7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5fk2x\" (UID: \"cbae76fd-5602-4144-907a-5aeaafbedee7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414626 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414650 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7mk8m\" (UniqueName: \"kubernetes.io/projected/6872c5d1-0892-4abc-9c68-5fe459ed1107-kube-api-access-7mk8m\") pod \"downloads-7954f5f757-lg2z9\" (UID: \"6872c5d1-0892-4abc-9c68-5fe459ed1107\") " pod="openshift-console/downloads-7954f5f757-lg2z9" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414674 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vpxsf\" (UniqueName: \"kubernetes.io/projected/1ede39ac-a466-4925-8a5a-1dd6679b1915-kube-api-access-vpxsf\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414696 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v97vs\" (UniqueName: \"kubernetes.io/projected/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-kube-api-access-v97vs\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414720 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vf7zg\" (UniqueName: \"kubernetes.io/projected/f7c20397-4233-45e6-a7f9-5e88942e7abf-kube-api-access-vf7zg\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " 
pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414746 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/908b84fc-c766-408a-905f-79ddf440ba2b-config\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414767 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf2c75a2-ca6a-415f-80ea-830f55899119-audit-dir\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414789 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-client-ca\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414810 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-dir\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414832 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414893 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414915 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-images\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414937 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5fw5r\" (UniqueName: \"kubernetes.io/projected/c330bb63-a0e1-4650-b3f9-71e0bf85da61-kube-api-access-5fw5r\") pod \"openshift-controller-manager-operator-756b6f6bc6-8ftlb\" (UID: \"c330bb63-a0e1-4650-b3f9-71e0bf85da61\") " 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414967 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5752755-110f-47d1-b9cf-ff3e35aabf8f-metrics-tls\") pod \"dns-operator-744455d44c-m6mzp\" (UID: \"a5752755-110f-47d1-b9cf-ff3e35aabf8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414987 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf2c75a2-ca6a-415f-80ea-830f55899119-audit-policies\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415011 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xxmn8\" (UniqueName: \"kubernetes.io/projected/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-kube-api-access-xxmn8\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415034 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf2c75a2-ca6a-415f-80ea-830f55899119-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415058 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cdae4fbe-72ef-47c6-a521-120230421079-available-featuregates\") pod \"openshift-config-operator-7777fb866f-drz9m\" (UID: \"cdae4fbe-72ef-47c6-a521-120230421079\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415082 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415114 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-oauth-config\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415136 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bf2c75a2-ca6a-415f-80ea-830f55899119-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " 
pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415158 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbae76fd-5602-4144-907a-5aeaafbedee7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5fk2x\" (UID: \"cbae76fd-5602-4144-907a-5aeaafbedee7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.414041 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415027 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415302 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415335 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-config\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415379 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ede39ac-a466-4925-8a5a-1dd6679b1915-serving-cert\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415411 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415568 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bf2c75a2-ca6a-415f-80ea-830f55899119-etcd-client\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415730 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7gtk\" (UniqueName: \"kubernetes.io/projected/ec8f3e81-ffd3-493f-93e4-d00e371c923c-kube-api-access-m7gtk\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415788 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5mdrp\" (UniqueName: \"kubernetes.io/projected/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-kube-api-access-5mdrp\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.415869 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-serving-cert\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416053 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-policies\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416095 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c330bb63-a0e1-4650-b3f9-71e0bf85da61-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8ftlb\" (UID: \"c330bb63-a0e1-4650-b3f9-71e0bf85da61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416121 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416363 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416131 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qk4q\" (UniqueName: \"kubernetes.io/projected/908b84fc-c766-408a-905f-79ddf440ba2b-kube-api-access-2qk4q\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416422 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416438 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf2c75a2-ca6a-415f-80ea-830f55899119-serving-cert\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416481 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: 
\"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416549 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-service-ca\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416585 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j6cwn\" (UniqueName: \"kubernetes.io/projected/4bc1666f-8e18-4c98-8424-01fe5598d275-kube-api-access-j6cwn\") pod \"openshift-apiserver-operator-796bbdcf4f-dtdpj\" (UID: \"4bc1666f-8e18-4c98-8424-01fe5598d275\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416448 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416621 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lgtbc\" (UniqueName: \"kubernetes.io/projected/a5752755-110f-47d1-b9cf-ff3e35aabf8f-kube-api-access-lgtbc\") pod \"dns-operator-744455d44c-m6mzp\" (UID: \"a5752755-110f-47d1-b9cf-ff3e35aabf8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416633 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416654 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bf2c75a2-ca6a-415f-80ea-830f55899119-encryption-config\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416688 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416575 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416734 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-config\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416766 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/908b84fc-c766-408a-905f-79ddf440ba2b-auth-proxy-config\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416800 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/908b84fc-c766-408a-905f-79ddf440ba2b-machine-approver-tls\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416871 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416835 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-oauth-serving-cert\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416933 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdae4fbe-72ef-47c6-a521-120230421079-serving-cert\") pod \"openshift-config-operator-7777fb866f-drz9m\" (UID: \"cdae4fbe-72ef-47c6-a521-120230421079\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.416987 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-trusted-ca-bundle\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.417047 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.417102 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.417167 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec8f3e81-ffd3-493f-93e4-d00e371c923c-config\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.417286 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b4pcp\" (UniqueName: \"kubernetes.io/projected/cdae4fbe-72ef-47c6-a521-120230421079-kube-api-access-b4pcp\") pod \"openshift-config-operator-7777fb866f-drz9m\" (UID: \"cdae4fbe-72ef-47c6-a521-120230421079\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.417383 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-config\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.417440 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.417517 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v2cd\" (UniqueName: \"kubernetes.io/projected/bf2c75a2-ca6a-415f-80ea-830f55899119-kube-api-access-4v2cd\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.418089 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.418741 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.418914 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.418966 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.419319 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.419464 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.420106 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.421140 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.421335 4772 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.421976 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.422000 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.422067 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.422137 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.422154 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.422193 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.421362 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.422980 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zlklv"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.423602 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.424326 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.424683 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.424701 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.425610 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.425959 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.426009 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.426202 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.426900 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.427091 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.427210 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.427407 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.427551 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.427594 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.428422 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.429309 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.430535 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.434029 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.434611 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.434697 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.435793 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.444177 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.444653 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.445134 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.447503 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.451246 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.451342 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.451439 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.452720 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.453139 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.453203 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.453413 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.454114 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.457468 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.458686 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.462636 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-9drnj"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.483194 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.484426 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.484481 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.485633 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6fhz7"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.486115 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.486467 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.486876 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.487421 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.488468 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.488577 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.488785 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.488917 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m22rd"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.489581 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.488921 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.491860 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.491951 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.492535 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.492679 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.492788 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.493051 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.493618 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zlsr2"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.494082 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.495107 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566738-zlqcq"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.495487 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-zlqcq" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.495876 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.496355 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.499637 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.499714 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.500350 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.500713 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.501537 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.506188 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-drz9m"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.506242 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s7p9n"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.508169 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-p564j"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.508883 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-p564j" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.511595 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.512772 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rq497"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.513027 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.515032 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lvstj"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.516702 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5r5fg"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518032 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fgwgm"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518172 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518212 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bf2c75a2-ca6a-415f-80ea-830f55899119-etcd-client\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518238 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7gtk\" (UniqueName: \"kubernetes.io/projected/ec8f3e81-ffd3-493f-93e4-d00e371c923c-kube-api-access-m7gtk\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518263 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5mdrp\" (UniqueName: \"kubernetes.io/projected/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-kube-api-access-5mdrp\") pod \"etcd-operator-b45778765-5r5fg\" (UID: 
\"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518288 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-serving-cert\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518309 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-policies\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518329 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c330bb63-a0e1-4650-b3f9-71e0bf85da61-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8ftlb\" (UID: \"c330bb63-a0e1-4650-b3f9-71e0bf85da61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518348 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2qk4q\" (UniqueName: \"kubernetes.io/projected/908b84fc-c766-408a-905f-79ddf440ba2b-kube-api-access-2qk4q\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518369 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf2c75a2-ca6a-415f-80ea-830f55899119-serving-cert\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518394 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518416 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-service-ca\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518439 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j6cwn\" (UniqueName: \"kubernetes.io/projected/4bc1666f-8e18-4c98-8424-01fe5598d275-kube-api-access-j6cwn\") pod \"openshift-apiserver-operator-796bbdcf4f-dtdpj\" (UID: \"4bc1666f-8e18-4c98-8424-01fe5598d275\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 
10:58:39.518467 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lgtbc\" (UniqueName: \"kubernetes.io/projected/a5752755-110f-47d1-b9cf-ff3e35aabf8f-kube-api-access-lgtbc\") pod \"dns-operator-744455d44c-m6mzp\" (UID: \"a5752755-110f-47d1-b9cf-ff3e35aabf8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518488 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bf2c75a2-ca6a-415f-80ea-830f55899119-encryption-config\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518511 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518532 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-config\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518558 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/908b84fc-c766-408a-905f-79ddf440ba2b-auth-proxy-config\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518577 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/908b84fc-c766-408a-905f-79ddf440ba2b-machine-approver-tls\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518595 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-oauth-serving-cert\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518616 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdae4fbe-72ef-47c6-a521-120230421079-serving-cert\") pod \"openshift-config-operator-7777fb866f-drz9m\" (UID: \"cdae4fbe-72ef-47c6-a521-120230421079\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518638 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-trusted-ca-bundle\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518659 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518678 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518698 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec8f3e81-ffd3-493f-93e4-d00e371c923c-config\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518728 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b4pcp\" (UniqueName: \"kubernetes.io/projected/cdae4fbe-72ef-47c6-a521-120230421079-kube-api-access-b4pcp\") pod \"openshift-config-operator-7777fb866f-drz9m\" (UID: \"cdae4fbe-72ef-47c6-a521-120230421079\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518756 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-config\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518781 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518801 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v2cd\" (UniqueName: \"kubernetes.io/projected/bf2c75a2-ca6a-415f-80ea-830f55899119-kube-api-access-4v2cd\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518823 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-etcd-ca\") pod \"etcd-operator-b45778765-5r5fg\" (UID: 
\"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518863 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec8f3e81-ffd3-493f-93e4-d00e371c923c-serving-cert\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518888 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c330bb63-a0e1-4650-b3f9-71e0bf85da61-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8ftlb\" (UID: \"c330bb63-a0e1-4650-b3f9-71e0bf85da61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518907 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-config\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518920 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518933 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec8f3e81-ffd3-493f-93e4-d00e371c923c-service-ca-bundle\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518972 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc1666f-8e18-4c98-8424-01fe5598d275-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dtdpj\" (UID: \"4bc1666f-8e18-4c98-8424-01fe5598d275\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.518995 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-etcd-client\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519032 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-serving-cert\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519055 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-etcd-service-ca\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519078 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bc1666f-8e18-4c98-8424-01fe5598d275-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dtdpj\" (UID: \"4bc1666f-8e18-4c98-8424-01fe5598d275\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519098 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519107 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-policies\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519117 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec8f3e81-ffd3-493f-93e4-d00e371c923c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519181 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbae76fd-5602-4144-907a-5aeaafbedee7-config\") pod \"kube-controller-manager-operator-78b949d7b-5fk2x\" (UID: \"cbae76fd-5602-4144-907a-5aeaafbedee7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519207 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbae76fd-5602-4144-907a-5aeaafbedee7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5fk2x\" (UID: \"cbae76fd-5602-4144-907a-5aeaafbedee7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519232 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519593 4772 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c330bb63-a0e1-4650-b3f9-71e0bf85da61-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-8ftlb\" (UID: \"c330bb63-a0e1-4650-b3f9-71e0bf85da61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519761 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec8f3e81-ffd3-493f-93e4-d00e371c923c-service-ca-bundle\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.519937 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ec8f3e81-ffd3-493f-93e4-d00e371c923c-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520062 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7mk8m\" (UniqueName: \"kubernetes.io/projected/6872c5d1-0892-4abc-9c68-5fe459ed1107-kube-api-access-7mk8m\") pod \"downloads-7954f5f757-lg2z9\" (UID: \"6872c5d1-0892-4abc-9c68-5fe459ed1107\") " pod="openshift-console/downloads-7954f5f757-lg2z9" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520154 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vpxsf\" (UniqueName: \"kubernetes.io/projected/1ede39ac-a466-4925-8a5a-1dd6679b1915-kube-api-access-vpxsf\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520179 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v97vs\" (UniqueName: \"kubernetes.io/projected/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-kube-api-access-v97vs\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520206 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vf7zg\" (UniqueName: \"kubernetes.io/projected/f7c20397-4233-45e6-a7f9-5e88942e7abf-kube-api-access-vf7zg\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520237 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/908b84fc-c766-408a-905f-79ddf440ba2b-config\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520261 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf2c75a2-ca6a-415f-80ea-830f55899119-audit-dir\") pod 
\"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520286 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-client-ca\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520308 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-dir\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520329 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520353 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520373 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-images\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520395 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5fw5r\" (UniqueName: \"kubernetes.io/projected/c330bb63-a0e1-4650-b3f9-71e0bf85da61-kube-api-access-5fw5r\") pod \"openshift-controller-manager-operator-756b6f6bc6-8ftlb\" (UID: \"c330bb63-a0e1-4650-b3f9-71e0bf85da61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520416 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5752755-110f-47d1-b9cf-ff3e35aabf8f-metrics-tls\") pod \"dns-operator-744455d44c-m6mzp\" (UID: \"a5752755-110f-47d1-b9cf-ff3e35aabf8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520439 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf2c75a2-ca6a-415f-80ea-830f55899119-audit-policies\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 
10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520464 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xxmn8\" (UniqueName: \"kubernetes.io/projected/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-kube-api-access-xxmn8\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520465 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-etcd-service-ca\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520491 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf2c75a2-ca6a-415f-80ea-830f55899119-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520520 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cdae4fbe-72ef-47c6-a521-120230421079-available-featuregates\") pod \"openshift-config-operator-7777fb866f-drz9m\" (UID: \"cdae4fbe-72ef-47c6-a521-120230421079\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520543 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520582 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-oauth-config\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520605 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bf2c75a2-ca6a-415f-80ea-830f55899119-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520626 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbae76fd-5602-4144-907a-5aeaafbedee7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5fk2x\" (UID: \"cbae76fd-5602-4144-907a-5aeaafbedee7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520653 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" 
(UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520676 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-config\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.520710 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ede39ac-a466-4925-8a5a-1dd6679b1915-serving-cert\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.522095 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4bc1666f-8e18-4c98-8424-01fe5598d275-config\") pod \"openshift-apiserver-operator-796bbdcf4f-dtdpj\" (UID: \"4bc1666f-8e18-4c98-8424-01fe5598d275\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.523576 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/bf2c75a2-ca6a-415f-80ea-830f55899119-encryption-config\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.524078 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-etcd-ca\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.524540 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf2c75a2-ca6a-415f-80ea-830f55899119-serving-cert\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.525930 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.527103 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-etcd-client\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.527608 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ec8f3e81-ffd3-493f-93e4-d00e371c923c-serving-cert\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.528030 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/bf2c75a2-ca6a-415f-80ea-830f55899119-etcd-client\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.528739 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.529502 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/bf2c75a2-ca6a-415f-80ea-830f55899119-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.529533 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/cbae76fd-5602-4144-907a-5aeaafbedee7-config\") pod \"kube-controller-manager-operator-78b949d7b-5fk2x\" (UID: \"cbae76fd-5602-4144-907a-5aeaafbedee7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.530132 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.530727 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c330bb63-a0e1-4650-b3f9-71e0bf85da61-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-8ftlb\" (UID: \"c330bb63-a0e1-4650-b3f9-71e0bf85da61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.530782 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a5752755-110f-47d1-b9cf-ff3e35aabf8f-metrics-tls\") pod \"dns-operator-744455d44c-m6mzp\" (UID: \"a5752755-110f-47d1-b9cf-ff3e35aabf8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.531030 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-f7txt"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.531329 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/bf2c75a2-ca6a-415f-80ea-830f55899119-audit-policies\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.531998 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/bf2c75a2-ca6a-415f-80ea-830f55899119-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.532449 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.532510 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/bf2c75a2-ca6a-415f-80ea-830f55899119-audit-dir\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.532930 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.533030 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m6mzp"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.533133 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.533232 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.533311 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/908b84fc-c766-408a-905f-79ddf440ba2b-config\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.533523 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.533857 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/cdae4fbe-72ef-47c6-a521-120230421079-available-featuregates\") pod \"openshift-config-operator-7777fb866f-drz9m\" (UID: \"cdae4fbe-72ef-47c6-a521-120230421079\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.534352 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-client-ca\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.534406 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-dir\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.534584 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-images\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.535542 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ec8f3e81-ffd3-493f-93e4-d00e371c923c-config\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.536743 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-config\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.536777 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/908b84fc-c766-408a-905f-79ddf440ba2b-auth-proxy-config\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.537617 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-config\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.537726 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-serving-cert\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.537908 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.538209 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.538949 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/908b84fc-c766-408a-905f-79ddf440ba2b-machine-approver-tls\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.539132 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-oauth-serving-cert\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.540029 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cdae4fbe-72ef-47c6-a521-120230421079-serving-cert\") pod \"openshift-config-operator-7777fb866f-drz9m\" (UID: \"cdae4fbe-72ef-47c6-a521-120230421079\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.540404 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-config\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.541243 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-service-ca\") pod 
\"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.543250 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-config\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.543636 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-trusted-ca-bundle\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.545706 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.546561 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4bc1666f-8e18-4c98-8424-01fe5598d275-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-dtdpj\" (UID: \"4bc1666f-8e18-4c98-8424-01fe5598d275\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.547535 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ede39ac-a466-4925-8a5a-1dd6679b1915-serving-cert\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.548275 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-serving-cert\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.549693 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.549922 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.550681 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.551589 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/cbae76fd-5602-4144-907a-5aeaafbedee7-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-5fk2x\" (UID: \"cbae76fd-5602-4144-907a-5aeaafbedee7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.553117 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.554401 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-oauth-config\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.556078 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.556449 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zlklv"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.559927 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.560320 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.561545 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mktq6"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.562583 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.563645 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.564252 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.564373 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:58:39 crc 
kubenswrapper[4772]: I0320 10:58:39.564859 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.565936 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6fhz7"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.567273 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.570419 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xd664"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.574183 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.578163 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.579601 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.582017 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.584072 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.585715 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.587874 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m22rd"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.589597 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.591228 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.592980 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.596111 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.597231 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zlsr2"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.598405 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-zlqcq"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.599814 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lg2z9"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.601446 4772 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-f7txt"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.602756 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-g9kg7"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.603534 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g9kg7" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.605020 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-p564j"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.605579 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.607118 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-2s9dj"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.607639 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2s9dj" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.608333 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2s9dj"] Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.614651 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.633269 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.653804 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.673964 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.693406 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.714166 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.733668 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.757547 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.785453 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.794205 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.813360 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.834478 4772 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.863767 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.875348 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.902129 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.913588 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.933960 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.954059 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.978247 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.986230 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 10:58:39 crc kubenswrapper[4772]: I0320 10:58:39.994053 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.013896 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.034054 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.053357 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.074987 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.094087 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.113950 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.133574 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.153528 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.174287 4772 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.194570 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.214180 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.233816 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.254034 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.273833 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.294783 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.313658 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.333765 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.354363 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.373140 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.393894 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.421016 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.434802 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.453720 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.474574 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.492514 4772 request.go:700] Waited for 1.004673894s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-scheduler-operator/secrets?fieldSelector=metadata.name%3Dopenshift-kube-scheduler-operator-dockercfg-qt55r&limit=500&resourceVersion=0 Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.493948 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.513623 4772 reflector.go:368] Caches 
populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.533705 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.554160 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.573941 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.594055 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.614325 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.634255 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.653943 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.673716 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.694374 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.714184 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.734435 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.765172 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.775950 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.794638 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.814286 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.833590 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.854191 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.873536 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 10:58:40 crc 
kubenswrapper[4772]: I0320 10:58:40.895615 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.913888 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.933048 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.953819 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.973656 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 10:58:40 crc kubenswrapper[4772]: I0320 10:58:40.994786 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.014002 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.034639 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.053786 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.074277 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.093739 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.114562 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.134082 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.171479 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7gtk\" (UniqueName: \"kubernetes.io/projected/ec8f3e81-ffd3-493f-93e4-d00e371c923c-kube-api-access-m7gtk\") pod \"authentication-operator-69f744f599-mktq6\" (UID: \"ec8f3e81-ffd3-493f-93e4-d00e371c923c\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.199447 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5mdrp\" (UniqueName: \"kubernetes.io/projected/b19a40a6-c2c9-47ec-8da6-bd23833c5a4a-kube-api-access-5mdrp\") pod \"etcd-operator-b45778765-5r5fg\" (UID: \"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a\") " pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.208722 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j6cwn\" (UniqueName: \"kubernetes.io/projected/4bc1666f-8e18-4c98-8424-01fe5598d275-kube-api-access-j6cwn\") pod 
\"openshift-apiserver-operator-796bbdcf4f-dtdpj\" (UID: \"4bc1666f-8e18-4c98-8424-01fe5598d275\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.229135 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2qk4q\" (UniqueName: \"kubernetes.io/projected/908b84fc-c766-408a-905f-79ddf440ba2b-kube-api-access-2qk4q\") pod \"machine-approver-56656f9798-d2cpm\" (UID: \"908b84fc-c766-408a-905f-79ddf440ba2b\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.234808 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.235270 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-7xktg"] Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.239004 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7xktg"] Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.239130 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.264528 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lgtbc\" (UniqueName: \"kubernetes.io/projected/a5752755-110f-47d1-b9cf-ff3e35aabf8f-kube-api-access-lgtbc\") pod \"dns-operator-744455d44c-m6mzp\" (UID: \"a5752755-110f-47d1-b9cf-ff3e35aabf8f\") " pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.301136 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/cbae76fd-5602-4144-907a-5aeaafbedee7-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-5fk2x\" (UID: \"cbae76fd-5602-4144-907a-5aeaafbedee7\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.301361 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5fw5r\" (UniqueName: \"kubernetes.io/projected/c330bb63-a0e1-4650-b3f9-71e0bf85da61-kube-api-access-5fw5r\") pod \"openshift-controller-manager-operator-756b6f6bc6-8ftlb\" (UID: \"c330bb63-a0e1-4650-b3f9-71e0bf85da61\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.313079 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.321910 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xxmn8\" (UniqueName: \"kubernetes.io/projected/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-kube-api-access-xxmn8\") pod \"oauth-openshift-558db77b4-rq497\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.339116 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7mk8m\" (UniqueName: \"kubernetes.io/projected/6872c5d1-0892-4abc-9c68-5fe459ed1107-kube-api-access-7mk8m\") pod \"downloads-7954f5f757-lg2z9\" (UID: \"6872c5d1-0892-4abc-9c68-5fe459ed1107\") " pod="openshift-console/downloads-7954f5f757-lg2z9" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.357624 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vpxsf\" (UniqueName: \"kubernetes.io/projected/1ede39ac-a466-4925-8a5a-1dd6679b1915-kube-api-access-vpxsf\") pod \"route-controller-manager-6576b87f9c-schhc\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.374069 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-lg2z9" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.381127 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v97vs\" (UniqueName: \"kubernetes.io/projected/45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6-kube-api-access-v97vs\") pod \"machine-api-operator-5694c8668f-lvstj\" (UID: \"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.382288 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.389818 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.393416 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.397373 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vf7zg\" (UniqueName: \"kubernetes.io/projected/f7c20397-4233-45e6-a7f9-5e88942e7abf-kube-api-access-vf7zg\") pod \"console-f9d7485db-fgwgm\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.399576 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.406150 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.414009 4772 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.420367 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.434790 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.452826 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj"] Mar 20 10:58:41 crc kubenswrapper[4772]: W0320 10:58:41.469555 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4bc1666f_8e18_4c98_8424_01fe5598d275.slice/crio-760b3785082d3a7551e2a77b6d9ac1c75fa9a9e77b628f669f2ab56e446f114c WatchSource:0}: Error finding container 760b3785082d3a7551e2a77b6d9ac1c75fa9a9e77b628f669f2ab56e446f114c: Status 404 returned error can't find the container with id 760b3785082d3a7551e2a77b6d9ac1c75fa9a9e77b628f669f2ab56e446f114c Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.470415 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v2cd\" (UniqueName: \"kubernetes.io/projected/bf2c75a2-ca6a-415f-80ea-830f55899119-kube-api-access-4v2cd\") pod \"apiserver-7bbb656c7d-fb454\" (UID: \"bf2c75a2-ca6a-415f-80ea-830f55899119\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.491445 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b4pcp\" (UniqueName: \"kubernetes.io/projected/cdae4fbe-72ef-47c6-a521-120230421079-kube-api-access-b4pcp\") pod \"openshift-config-operator-7777fb866f-drz9m\" (UID: \"cdae4fbe-72ef-47c6-a521-120230421079\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.512567 4772 request.go:700] Waited for 1.917888959s due to client-side throttling, not priority and fairness, request: PATCH:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-controller-manager-operator/pods/openshift-controller-manager-operator-756b6f6bc6-8ftlb/status Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.514524 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.517076 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.534692 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547454 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-audit\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547524 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-image-import-ca\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547572 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547603 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13ebbb5a-3355-4661-92f7-651afafe19e1-serving-cert\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547631 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13ebbb5a-3355-4661-92f7-651afafe19e1-audit-dir\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547656 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-config\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547677 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/13ebbb5a-3355-4661-92f7-651afafe19e1-etcd-client\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547703 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-certificates\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547725 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-bound-sa-token\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547745 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jtfzm\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-kube-api-access-jtfzm\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547768 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa1a0fe0-24cb-4a49-9c1c-9624889ccf31-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zbcmq\" (UID: \"aa1a0fe0-24cb-4a49-9c1c-9624889ccf31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547794 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-trusted-ca\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547830 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/13ebbb5a-3355-4661-92f7-651afafe19e1-encryption-config\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547875 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-etcd-serving-ca\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547898 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twbvs\" (UniqueName: \"kubernetes.io/projected/13ebbb5a-3355-4661-92f7-651afafe19e1-kube-api-access-twbvs\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547921 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15ca766d-44d0-4433-b2f8-6348e66ee047-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc 
kubenswrapper[4772]: I0320 10:58:41.547955 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/13ebbb5a-3355-4661-92f7-651afafe19e1-node-pullsecrets\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.547977 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-tls\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.548002 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15ca766d-44d0-4433-b2f8-6348e66ee047-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.548028 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zvrqj\" (UniqueName: \"kubernetes.io/projected/aa1a0fe0-24cb-4a49-9c1c-9624889ccf31-kube-api-access-zvrqj\") pod \"cluster-samples-operator-665b6dd947-zbcmq\" (UID: \"aa1a0fe0-24cb-4a49-9c1c-9624889ccf31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.548049 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: E0320 10:58:41.548422 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.048406557 +0000 UTC m=+208.139373042 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.556859 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.576474 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.593907 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.595572 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.605605 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.613594 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.618974 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.624211 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb"] Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.633205 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.649345 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:41 crc kubenswrapper[4772]: E0320 10:58:41.649569 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.149527988 +0000 UTC m=+208.240494473 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.649713 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5-srv-cert\") pod \"olm-operator-6b444d44fb-c2hdq\" (UID: \"3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.649768 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a4f4cd-feb2-4c87-99f7-04202818012f-metrics-certs\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.649799 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9tcpb\" (UniqueName: \"kubernetes.io/projected/ddddd02c-4970-4017-a493-1f9eb50214f3-kube-api-access-9tcpb\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.649883 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-bound-sa-token\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.650135 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa1a0fe0-24cb-4a49-9c1c-9624889ccf31-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zbcmq\" (UID: \"aa1a0fe0-24cb-4a49-9c1c-9624889ccf31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.650184 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dl6dx\" (UniqueName: \"kubernetes.io/projected/06381439-6997-45aa-8dce-62b012b0ac68-kube-api-access-dl6dx\") pod \"collect-profiles-29566725-gdkpg\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.650285 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jwsx\" (UniqueName: \"kubernetes.io/projected/bfa70bc3-d525-49db-94fd-316370428815-kube-api-access-5jwsx\") pod \"machine-config-controller-84d6567774-n69m8\" (UID: \"bfa70bc3-d525-49db-94fd-316370428815\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 
10:58:41.650893 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mtzl8\" (UniqueName: \"kubernetes.io/projected/81a4f4cd-feb2-4c87-99f7-04202818012f-kube-api-access-mtzl8\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651117 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sjfrb\" (UniqueName: \"kubernetes.io/projected/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-kube-api-access-sjfrb\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651177 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651277 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f920728-8d6a-43d6-989a-4ee1665c76ab-webhook-cert\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651315 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-twbvs\" (UniqueName: \"kubernetes.io/projected/13ebbb5a-3355-4661-92f7-651afafe19e1-kube-api-access-twbvs\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651366 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cec3fd7-c8be-4ea2-b196-6262ab488fac-proxy-tls\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651418 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfa70bc3-d525-49db-94fd-316370428815-proxy-tls\") pod \"machine-config-controller-84d6567774-n69m8\" (UID: \"bfa70bc3-d525-49db-94fd-316370428815\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651517 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d8487f2-85df-4de3-a487-34f79c15ef8a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j6hfm\" (UID: \"2d8487f2-85df-4de3-a487-34f79c15ef8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651546 
4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-serving-cert\") pod \"console-operator-58897d9998-zlklv\" (UID: \"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651568 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5394f70-6289-4ea8-8169-b05a765fccfd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r9gfq\" (UID: \"c5394f70-6289-4ea8-8169-b05a765fccfd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651595 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-stnlj\" (UniqueName: \"kubernetes.io/projected/addc805d-7a3b-48d5-8f51-febb095bf28a-kube-api-access-stnlj\") pod \"package-server-manager-789f6589d5-rjcd2\" (UID: \"addc805d-7a3b-48d5-8f51-febb095bf28a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651614 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bfa70bc3-d525-49db-94fd-316370428815-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n69m8\" (UID: \"bfa70bc3-d525-49db-94fd-316370428815\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.651876 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-socket-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652030 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhlbx\" (UniqueName: \"kubernetes.io/projected/77c87234-b79b-4d2f-8ee3-b14aa050925a-kube-api-access-nhlbx\") pod \"marketplace-operator-79b997595-6fhz7\" (UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652066 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hq2ch\" (UniqueName: \"kubernetes.io/projected/709e108a-3d2e-473b-bff3-28cb1269f598-kube-api-access-hq2ch\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652090 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmvgr\" (UniqueName: \"kubernetes.io/projected/2cec3fd7-c8be-4ea2-b196-6262ab488fac-kube-api-access-nmvgr\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652124 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f4d9edb-87ce-41e1-9cc0-aaf07230ec92-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8l854\" (UID: \"9f4d9edb-87ce-41e1-9cc0-aaf07230ec92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652159 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652202 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-config\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652279 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd360a95-92b0-4cbc-b73f-87bb4274bff5-serving-cert\") pod \"service-ca-operator-777779d784-c8tzc\" (UID: \"bd360a95-92b0-4cbc-b73f-87bb4274bff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652303 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8487f2-85df-4de3-a487-34f79c15ef8a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-j6hfm\" (UID: \"2d8487f2-85df-4de3-a487-34f79c15ef8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652601 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652624 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-trusted-ca\") pod \"console-operator-58897d9998-zlklv\" (UID: \"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652679 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6fhz7\" 
(UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652755 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4fhzv\" (UniqueName: \"kubernetes.io/projected/ef57ed62-2f1f-4411-974a-f4cf6c624e25-kube-api-access-4fhzv\") pod \"multus-admission-controller-857f4d67dd-m22rd\" (UID: \"ef57ed62-2f1f-4411-974a-f4cf6c624e25\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652798 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-audit\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652817 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b439d8ad-3896-4c92-bd13-e094f9a63b7c-node-bootstrap-token\") pod \"machine-config-server-g9kg7\" (UID: \"b439d8ad-3896-4c92-bd13-e094f9a63b7c\") " pod="openshift-machine-config-operator/machine-config-server-g9kg7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652922 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/34a9a96c-7d9a-412b-ac98-76747d89f7ba-srv-cert\") pod \"catalog-operator-68c6474976-dmbpv\" (UID: \"34a9a96c-7d9a-412b-ac98-76747d89f7ba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652942 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bb76d\" (UniqueName: \"kubernetes.io/projected/bd360a95-92b0-4cbc-b73f-87bb4274bff5-kube-api-access-bb76d\") pod \"service-ca-operator-777779d784-c8tzc\" (UID: \"bd360a95-92b0-4cbc-b73f-87bb4274bff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652961 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2563e1e5-aeb2-4d41-857f-171e91d41281-signing-key\") pod \"service-ca-9c57cc56f-p564j\" (UID: \"2563e1e5-aeb2-4d41-857f-171e91d41281\") " pod="openshift-service-ca/service-ca-9c57cc56f-p564j" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.652993 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-image-import-ca\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.653028 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.653032 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d96629a-14b8-4f25-a58e-65e5eaf8b141-config-volume\") pod 
\"dns-default-7xktg\" (UID: \"0d96629a-14b8-4f25-a58e-65e5eaf8b141\") " pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.653184 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06381439-6997-45aa-8dce-62b012b0ac68-config-volume\") pod \"collect-profiles-29566725-gdkpg\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.653216 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24b9f616-8834-4b47-8b4a-1d25d5efb4f2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5nnrh\" (UID: \"24b9f616-8834-4b47-8b4a-1d25d5efb4f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.653246 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2563e1e5-aeb2-4d41-857f-171e91d41281-signing-cabundle\") pod \"service-ca-9c57cc56f-p564j\" (UID: \"2563e1e5-aeb2-4d41-857f-171e91d41281\") " pod="openshift-service-ca/service-ca-9c57cc56f-p564j" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.653263 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rmx56\" (UniqueName: \"kubernetes.io/projected/2563e1e5-aeb2-4d41-857f-171e91d41281-kube-api-access-rmx56\") pod \"service-ca-9c57cc56f-p564j\" (UID: \"2563e1e5-aeb2-4d41-857f-171e91d41281\") " pod="openshift-service-ca/service-ca-9c57cc56f-p564j" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.653780 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8f920728-8d6a-43d6-989a-4ee1665c76ab-tmpfs\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.653833 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13ebbb5a-3355-4661-92f7-651afafe19e1-serving-cert\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.653893 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/709e108a-3d2e-473b-bff3-28cb1269f598-trusted-ca\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.653912 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-csi-data-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc 
kubenswrapper[4772]: I0320 10:58:41.653928 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.654164 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/aa1a0fe0-24cb-4a49-9c1c-9624889ccf31-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-zbcmq\" (UID: \"aa1a0fe0-24cb-4a49-9c1c-9624889ccf31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.654322 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-audit\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.654680 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-trusted-ca-bundle\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.654730 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/13ebbb5a-3355-4661-92f7-651afafe19e1-etcd-client\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.654754 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ef57ed62-2f1f-4411-974a-f4cf6c624e25-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m22rd\" (UID: \"ef57ed62-2f1f-4411-974a-f4cf6c624e25\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.654774 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-registration-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.655118 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-image-import-ca\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.655245 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24b9f616-8834-4b47-8b4a-1d25d5efb4f2-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-5nnrh\" (UID: \"24b9f616-8834-4b47-8b4a-1d25d5efb4f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.655270 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d96629a-14b8-4f25-a58e-65e5eaf8b141-metrics-tls\") pod \"dns-default-7xktg\" (UID: \"0d96629a-14b8-4f25-a58e-65e5eaf8b141\") " pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.655295 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c42f4b6b-59e0-41f7-a6de-494a458b064b-cert\") pod \"ingress-canary-2s9dj\" (UID: \"c42f4b6b-59e0-41f7-a6de-494a458b064b\") " pod="openshift-ingress-canary/ingress-canary-2s9dj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.655526 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-certificates\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656137 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jtfzm\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-kube-api-access-jtfzm\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656181 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06381439-6997-45aa-8dce-62b012b0ac68-secret-volume\") pod \"collect-profiles-29566725-gdkpg\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656302 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd360a95-92b0-4cbc-b73f-87bb4274bff5-config\") pod \"service-ca-operator-777779d784-c8tzc\" (UID: \"bd360a95-92b0-4cbc-b73f-87bb4274bff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656369 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fzpt5\" (UniqueName: \"kubernetes.io/projected/0d96629a-14b8-4f25-a58e-65e5eaf8b141-kube-api-access-fzpt5\") pod \"dns-default-7xktg\" (UID: \"0d96629a-14b8-4f25-a58e-65e5eaf8b141\") " pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656417 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xxr5\" (UniqueName: \"kubernetes.io/projected/8f920728-8d6a-43d6-989a-4ee1665c76ab-kube-api-access-8xxr5\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 
20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656434 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-certificates\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656478 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-trusted-ca\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656492 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/13ebbb5a-3355-4661-92f7-651afafe19e1-serving-cert\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656497 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/34a9a96c-7d9a-412b-ac98-76747d89f7ba-profile-collector-cert\") pod \"catalog-operator-68c6474976-dmbpv\" (UID: \"34a9a96c-7d9a-412b-ac98-76747d89f7ba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656570 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/13ebbb5a-3355-4661-92f7-651afafe19e1-encryption-config\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-etcd-serving-ca\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656858 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c2hdq\" (UID: \"3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656876 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cc2pb\" (UniqueName: \"kubernetes.io/projected/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-kube-api-access-cc2pb\") pod \"console-operator-58897d9998-zlklv\" (UID: \"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656906 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/addc805d-7a3b-48d5-8f51-febb095bf28a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rjcd2\" (UID: \"addc805d-7a3b-48d5-8f51-febb095bf28a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.656924 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-serving-cert\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.657334 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/13ebbb5a-3355-4661-92f7-651afafe19e1-etcd-client\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.657384 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-etcd-serving-ca\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.657487 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrsrb\" (UniqueName: \"kubernetes.io/projected/9f4d9edb-87ce-41e1-9cc0-aaf07230ec92-kube-api-access-hrsrb\") pod \"control-plane-machine-set-operator-78cbb6b69f-8l854\" (UID: \"9f4d9edb-87ce-41e1-9cc0-aaf07230ec92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.657561 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-trusted-ca\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.657568 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15ca766d-44d0-4433-b2f8-6348e66ee047-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.657617 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c77bz\" (UniqueName: \"kubernetes.io/projected/3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5-kube-api-access-c77bz\") pod \"olm-operator-6b444d44fb-c2hdq\" (UID: \"3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.657637 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b439d8ad-3896-4c92-bd13-e094f9a63b7c-certs\") pod \"machine-config-server-g9kg7\" (UID: 
\"b439d8ad-3896-4c92-bd13-e094f9a63b7c\") " pod="openshift-machine-config-operator/machine-config-server-g9kg7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.657669 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2cec3fd7-c8be-4ea2-b196-6262ab488fac-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658006 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/709e108a-3d2e-473b-bff3-28cb1269f598-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658103 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d66cw\" (UniqueName: \"kubernetes.io/projected/9309f110-5a80-46ca-b3de-8087048c13e2-kube-api-access-d66cw\") pod \"auto-csr-approver-29566738-zlqcq\" (UID: \"9309f110-5a80-46ca-b3de-8087048c13e2\") " pod="openshift-infra/auto-csr-approver-29566738-zlqcq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658169 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-client-ca\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658202 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tbh48\" (UniqueName: \"kubernetes.io/projected/c42f4b6b-59e0-41f7-a6de-494a458b064b-kube-api-access-tbh48\") pod \"ingress-canary-2s9dj\" (UID: \"c42f4b6b-59e0-41f7-a6de-494a458b064b\") " pod="openshift-ingress-canary/ingress-canary-2s9dj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658237 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/13ebbb5a-3355-4661-92f7-651afafe19e1-node-pullsecrets\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658301 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/13ebbb5a-3355-4661-92f7-651afafe19e1-node-pullsecrets\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658472 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-tls\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658633 
4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5394f70-6289-4ea8-8169-b05a765fccfd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r9gfq\" (UID: \"c5394f70-6289-4ea8-8169-b05a765fccfd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658706 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mrxb9\" (UniqueName: \"kubernetes.io/projected/2ce1b289-5291-4da2-afbc-0b5320e730b9-kube-api-access-mrxb9\") pod \"migrator-59844c95c7-mz2wm\" (UID: \"2ce1b289-5291-4da2-afbc-0b5320e730b9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658767 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15ca766d-44d0-4433-b2f8-6348e66ee047-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658788 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8487f2-85df-4de3-a487-34f79c15ef8a-config\") pod \"kube-apiserver-operator-766d6c64bb-j6hfm\" (UID: \"2d8487f2-85df-4de3-a487-34f79c15ef8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658947 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zvrqj\" (UniqueName: \"kubernetes.io/projected/aa1a0fe0-24cb-4a49-9c1c-9624889ccf31-kube-api-access-zvrqj\") pod \"cluster-samples-operator-665b6dd947-zbcmq\" (UID: \"aa1a0fe0-24cb-4a49-9c1c-9624889ccf31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.658998 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x7vn4\" (UniqueName: \"kubernetes.io/projected/34a9a96c-7d9a-412b-ac98-76747d89f7ba-kube-api-access-x7vn4\") pod \"catalog-operator-68c6474976-dmbpv\" (UID: \"34a9a96c-7d9a-412b-ac98-76747d89f7ba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659079 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81a4f4cd-feb2-4c87-99f7-04202818012f-service-ca-bundle\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659113 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f920728-8d6a-43d6-989a-4ee1665c76ab-apiservice-cert\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659162 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-config\") pod \"console-operator-58897d9998-zlklv\" (UID: \"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659211 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15ca766d-44d0-4433-b2f8-6348e66ee047-ca-trust-extracted\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659233 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-plugins-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659605 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/13ebbb5a-3355-4661-92f7-651afafe19e1-encryption-config\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659649 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/709e108a-3d2e-473b-bff3-28cb1269f598-metrics-tls\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659676 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659730 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t8bfk\" (UniqueName: \"kubernetes.io/projected/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-kube-api-access-t8bfk\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659807 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6fhz7\" (UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659864 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/c5394f70-6289-4ea8-8169-b05a765fccfd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r9gfq\" (UID: \"c5394f70-6289-4ea8-8169-b05a765fccfd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659895 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-mountpoint-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659926 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v2pnj\" (UniqueName: \"kubernetes.io/projected/24b9f616-8834-4b47-8b4a-1d25d5efb4f2-kube-api-access-v2pnj\") pod \"kube-storage-version-migrator-operator-b67b599dd-5nnrh\" (UID: \"24b9f616-8834-4b47-8b4a-1d25d5efb4f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659942 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/81a4f4cd-feb2-4c87-99f7-04202818012f-stats-auth\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.659987 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/81a4f4cd-feb2-4c87-99f7-04202818012f-default-certificate\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.660028 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.660493 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2cec3fd7-c8be-4ea2-b196-6262ab488fac-images\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.670910 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15ca766d-44d0-4433-b2f8-6348e66ee047-installation-pull-secrets\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: E0320 10:58:41.671378 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.171239508 +0000 UTC m=+208.262205993 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.672205 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13ebbb5a-3355-4661-92f7-651afafe19e1-audit-dir\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.672247 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-config\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.672370 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nxnz5\" (UniqueName: \"kubernetes.io/projected/b439d8ad-3896-4c92-bd13-e094f9a63b7c-kube-api-access-nxnz5\") pod \"machine-config-server-g9kg7\" (UID: \"b439d8ad-3896-4c92-bd13-e094f9a63b7c\") " pod="openshift-machine-config-operator/machine-config-server-g9kg7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.673338 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/13ebbb5a-3355-4661-92f7-651afafe19e1-config\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.673918 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/13ebbb5a-3355-4661-92f7-651afafe19e1-audit-dir\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.678363 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.681585 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-tls\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.694999 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.714024 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" event={"ID":"908b84fc-c766-408a-905f-79ddf440ba2b","Type":"ContainerStarted","Data":"c44584dca56c92918ea2e39cbefd342c5b9344d1aa999a8f369e1418928329ae"} Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.718869 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.722383 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" event={"ID":"4bc1666f-8e18-4c98-8424-01fe5598d275","Type":"ContainerStarted","Data":"760b3785082d3a7551e2a77b6d9ac1c75fa9a9e77b628f669f2ab56e446f114c"} Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.723301 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" event={"ID":"c330bb63-a0e1-4650-b3f9-71e0bf85da61","Type":"ContainerStarted","Data":"c9775cbe5a0b5326edc60779c7ed74ed892a51d40157b7e721cd7cdd1ee58175"} Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.753638 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.774243 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.775903 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-bound-sa-token\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: E0320 10:58:41.774570 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.27453463 +0000 UTC m=+208.365501125 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776117 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nxnz5\" (UniqueName: \"kubernetes.io/projected/b439d8ad-3896-4c92-bd13-e094f9a63b7c-kube-api-access-nxnz5\") pod \"machine-config-server-g9kg7\" (UID: \"b439d8ad-3896-4c92-bd13-e094f9a63b7c\") " pod="openshift-machine-config-operator/machine-config-server-g9kg7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776157 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5-srv-cert\") pod \"olm-operator-6b444d44fb-c2hdq\" (UID: \"3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776181 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a4f4cd-feb2-4c87-99f7-04202818012f-metrics-certs\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776208 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9tcpb\" (UniqueName: \"kubernetes.io/projected/ddddd02c-4970-4017-a493-1f9eb50214f3-kube-api-access-9tcpb\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776236 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dl6dx\" (UniqueName: \"kubernetes.io/projected/06381439-6997-45aa-8dce-62b012b0ac68-kube-api-access-dl6dx\") pod \"collect-profiles-29566725-gdkpg\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776270 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5jwsx\" (UniqueName: \"kubernetes.io/projected/bfa70bc3-d525-49db-94fd-316370428815-kube-api-access-5jwsx\") pod \"machine-config-controller-84d6567774-n69m8\" (UID: \"bfa70bc3-d525-49db-94fd-316370428815\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776294 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mtzl8\" (UniqueName: \"kubernetes.io/projected/81a4f4cd-feb2-4c87-99f7-04202818012f-kube-api-access-mtzl8\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776316 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"kube-api-access-sjfrb\" (UniqueName: \"kubernetes.io/projected/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-kube-api-access-sjfrb\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776337 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776359 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f920728-8d6a-43d6-989a-4ee1665c76ab-webhook-cert\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776389 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cec3fd7-c8be-4ea2-b196-6262ab488fac-proxy-tls\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776407 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfa70bc3-d525-49db-94fd-316370428815-proxy-tls\") pod \"machine-config-controller-84d6567774-n69m8\" (UID: \"bfa70bc3-d525-49db-94fd-316370428815\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776433 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d8487f2-85df-4de3-a487-34f79c15ef8a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j6hfm\" (UID: \"2d8487f2-85df-4de3-a487-34f79c15ef8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776458 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-serving-cert\") pod \"console-operator-58897d9998-zlklv\" (UID: \"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776483 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5394f70-6289-4ea8-8169-b05a765fccfd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r9gfq\" (UID: \"c5394f70-6289-4ea8-8169-b05a765fccfd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776512 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-stnlj\" (UniqueName: 
\"kubernetes.io/projected/addc805d-7a3b-48d5-8f51-febb095bf28a-kube-api-access-stnlj\") pod \"package-server-manager-789f6589d5-rjcd2\" (UID: \"addc805d-7a3b-48d5-8f51-febb095bf28a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776535 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bfa70bc3-d525-49db-94fd-316370428815-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n69m8\" (UID: \"bfa70bc3-d525-49db-94fd-316370428815\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776567 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-socket-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776586 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hq2ch\" (UniqueName: \"kubernetes.io/projected/709e108a-3d2e-473b-bff3-28cb1269f598-kube-api-access-hq2ch\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776605 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmvgr\" (UniqueName: \"kubernetes.io/projected/2cec3fd7-c8be-4ea2-b196-6262ab488fac-kube-api-access-nmvgr\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776625 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/9f4d9edb-87ce-41e1-9cc0-aaf07230ec92-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8l854\" (UID: \"9f4d9edb-87ce-41e1-9cc0-aaf07230ec92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776651 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhlbx\" (UniqueName: \"kubernetes.io/projected/77c87234-b79b-4d2f-8ee3-b14aa050925a-kube-api-access-nhlbx\") pod \"marketplace-operator-79b997595-6fhz7\" (UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776681 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776708 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-config\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776736 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd360a95-92b0-4cbc-b73f-87bb4274bff5-serving-cert\") pod \"service-ca-operator-777779d784-c8tzc\" (UID: \"bd360a95-92b0-4cbc-b73f-87bb4274bff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776768 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8487f2-85df-4de3-a487-34f79c15ef8a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-j6hfm\" (UID: \"2d8487f2-85df-4de3-a487-34f79c15ef8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776798 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-trusted-ca\") pod \"console-operator-58897d9998-zlklv\" (UID: \"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776833 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6fhz7\" (UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776887 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4fhzv\" (UniqueName: \"kubernetes.io/projected/ef57ed62-2f1f-4411-974a-f4cf6c624e25-kube-api-access-4fhzv\") pod \"multus-admission-controller-857f4d67dd-m22rd\" (UID: \"ef57ed62-2f1f-4411-974a-f4cf6c624e25\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776917 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b439d8ad-3896-4c92-bd13-e094f9a63b7c-node-bootstrap-token\") pod \"machine-config-server-g9kg7\" (UID: \"b439d8ad-3896-4c92-bd13-e094f9a63b7c\") " pod="openshift-machine-config-operator/machine-config-server-g9kg7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776943 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2563e1e5-aeb2-4d41-857f-171e91d41281-signing-key\") pod \"service-ca-9c57cc56f-p564j\" (UID: \"2563e1e5-aeb2-4d41-857f-171e91d41281\") " pod="openshift-service-ca/service-ca-9c57cc56f-p564j" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776971 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/34a9a96c-7d9a-412b-ac98-76747d89f7ba-srv-cert\") pod \"catalog-operator-68c6474976-dmbpv\" (UID: \"34a9a96c-7d9a-412b-ac98-76747d89f7ba\") " 
pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.776999 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bb76d\" (UniqueName: \"kubernetes.io/projected/bd360a95-92b0-4cbc-b73f-87bb4274bff5-kube-api-access-bb76d\") pod \"service-ca-operator-777779d784-c8tzc\" (UID: \"bd360a95-92b0-4cbc-b73f-87bb4274bff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.777039 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d96629a-14b8-4f25-a58e-65e5eaf8b141-config-volume\") pod \"dns-default-7xktg\" (UID: \"0d96629a-14b8-4f25-a58e-65e5eaf8b141\") " pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.780393 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-socket-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.781441 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5-srv-cert\") pod \"olm-operator-6b444d44fb-c2hdq\" (UID: \"3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.782582 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0d96629a-14b8-4f25-a58e-65e5eaf8b141-config-volume\") pod \"dns-default-7xktg\" (UID: \"0d96629a-14b8-4f25-a58e-65e5eaf8b141\") " pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.782738 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-6fhz7\" (UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.783353 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-trusted-ca\") pod \"console-operator-58897d9998-zlklv\" (UID: \"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.784026 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06381439-6997-45aa-8dce-62b012b0ac68-config-volume\") pod \"collect-profiles-29566725-gdkpg\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.784060 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-serving-cert\") pod \"console-operator-58897d9998-zlklv\" (UID: 
\"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.785496 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/2d8487f2-85df-4de3-a487-34f79c15ef8a-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-j6hfm\" (UID: \"2d8487f2-85df-4de3-a487-34f79c15ef8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.785559 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-config\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.785642 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.785911 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/b439d8ad-3896-4c92-bd13-e094f9a63b7c-node-bootstrap-token\") pod \"machine-config-server-g9kg7\" (UID: \"b439d8ad-3896-4c92-bd13-e094f9a63b7c\") " pod="openshift-machine-config-operator/machine-config-server-g9kg7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786079 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/8f920728-8d6a-43d6-989a-4ee1665c76ab-webhook-cert\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786177 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/34a9a96c-7d9a-412b-ac98-76747d89f7ba-srv-cert\") pod \"catalog-operator-68c6474976-dmbpv\" (UID: \"34a9a96c-7d9a-412b-ac98-76747d89f7ba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786255 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/bfa70bc3-d525-49db-94fd-316370428815-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-n69m8\" (UID: \"bfa70bc3-d525-49db-94fd-316370428815\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786378 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06381439-6997-45aa-8dce-62b012b0ac68-config-volume\") pod \"collect-profiles-29566725-gdkpg\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786430 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24b9f616-8834-4b47-8b4a-1d25d5efb4f2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5nnrh\" (UID: \"24b9f616-8834-4b47-8b4a-1d25d5efb4f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786458 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2563e1e5-aeb2-4d41-857f-171e91d41281-signing-cabundle\") pod \"service-ca-9c57cc56f-p564j\" (UID: \"2563e1e5-aeb2-4d41-857f-171e91d41281\") " pod="openshift-service-ca/service-ca-9c57cc56f-p564j" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786480 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rmx56\" (UniqueName: \"kubernetes.io/projected/2563e1e5-aeb2-4d41-857f-171e91d41281-kube-api-access-rmx56\") pod \"service-ca-9c57cc56f-p564j\" (UID: \"2563e1e5-aeb2-4d41-857f-171e91d41281\") " pod="openshift-service-ca/service-ca-9c57cc56f-p564j" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786520 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8f920728-8d6a-43d6-989a-4ee1665c76ab-tmpfs\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786543 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-csi-data-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786565 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786604 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/709e108a-3d2e-473b-bff3-28cb1269f598-trusted-ca\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786636 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ef57ed62-2f1f-4411-974a-f4cf6c624e25-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m22rd\" (UID: \"ef57ed62-2f1f-4411-974a-f4cf6c624e25\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786655 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-registration-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: 
\"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786674 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24b9f616-8834-4b47-8b4a-1d25d5efb4f2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5nnrh\" (UID: \"24b9f616-8834-4b47-8b4a-1d25d5efb4f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786695 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d96629a-14b8-4f25-a58e-65e5eaf8b141-metrics-tls\") pod \"dns-default-7xktg\" (UID: \"0d96629a-14b8-4f25-a58e-65e5eaf8b141\") " pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786721 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c42f4b6b-59e0-41f7-a6de-494a458b064b-cert\") pod \"ingress-canary-2s9dj\" (UID: \"c42f4b6b-59e0-41f7-a6de-494a458b064b\") " pod="openshift-ingress-canary/ingress-canary-2s9dj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786752 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06381439-6997-45aa-8dce-62b012b0ac68-secret-volume\") pod \"collect-profiles-29566725-gdkpg\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786780 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd360a95-92b0-4cbc-b73f-87bb4274bff5-config\") pod \"service-ca-operator-777779d784-c8tzc\" (UID: \"bd360a95-92b0-4cbc-b73f-87bb4274bff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786802 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fzpt5\" (UniqueName: \"kubernetes.io/projected/0d96629a-14b8-4f25-a58e-65e5eaf8b141-kube-api-access-fzpt5\") pod \"dns-default-7xktg\" (UID: \"0d96629a-14b8-4f25-a58e-65e5eaf8b141\") " pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786853 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xxr5\" (UniqueName: \"kubernetes.io/projected/8f920728-8d6a-43d6-989a-4ee1665c76ab-kube-api-access-8xxr5\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786874 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/34a9a96c-7d9a-412b-ac98-76747d89f7ba-profile-collector-cert\") pod \"catalog-operator-68c6474976-dmbpv\" (UID: \"34a9a96c-7d9a-412b-ac98-76747d89f7ba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786915 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/addc805d-7a3b-48d5-8f51-febb095bf28a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rjcd2\" (UID: \"addc805d-7a3b-48d5-8f51-febb095bf28a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786925 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/2cec3fd7-c8be-4ea2-b196-6262ab488fac-proxy-tls\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.786937 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-serving-cert\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787160 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c2hdq\" (UID: \"3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787199 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cc2pb\" (UniqueName: \"kubernetes.io/projected/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-kube-api-access-cc2pb\") pod \"console-operator-58897d9998-zlklv\" (UID: \"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787224 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrsrb\" (UniqueName: \"kubernetes.io/projected/9f4d9edb-87ce-41e1-9cc0-aaf07230ec92-kube-api-access-hrsrb\") pod \"control-plane-machine-set-operator-78cbb6b69f-8l854\" (UID: \"9f4d9edb-87ce-41e1-9cc0-aaf07230ec92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787251 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b439d8ad-3896-4c92-bd13-e094f9a63b7c-certs\") pod \"machine-config-server-g9kg7\" (UID: \"b439d8ad-3896-4c92-bd13-e094f9a63b7c\") " pod="openshift-machine-config-operator/machine-config-server-g9kg7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787275 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c77bz\" (UniqueName: \"kubernetes.io/projected/3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5-kube-api-access-c77bz\") pod \"olm-operator-6b444d44fb-c2hdq\" (UID: \"3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787294 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/2cec3fd7-c8be-4ea2-b196-6262ab488fac-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787313 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-client-ca\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787369 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tbh48\" (UniqueName: \"kubernetes.io/projected/c42f4b6b-59e0-41f7-a6de-494a458b064b-kube-api-access-tbh48\") pod \"ingress-canary-2s9dj\" (UID: \"c42f4b6b-59e0-41f7-a6de-494a458b064b\") " pod="openshift-ingress-canary/ingress-canary-2s9dj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787393 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/709e108a-3d2e-473b-bff3-28cb1269f598-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787422 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d66cw\" (UniqueName: \"kubernetes.io/projected/9309f110-5a80-46ca-b3de-8087048c13e2-kube-api-access-d66cw\") pod \"auto-csr-approver-29566738-zlqcq\" (UID: \"9309f110-5a80-46ca-b3de-8087048c13e2\") " pod="openshift-infra/auto-csr-approver-29566738-zlqcq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787446 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c5394f70-6289-4ea8-8169-b05a765fccfd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r9gfq\" (UID: \"c5394f70-6289-4ea8-8169-b05a765fccfd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787469 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mrxb9\" (UniqueName: \"kubernetes.io/projected/2ce1b289-5291-4da2-afbc-0b5320e730b9-kube-api-access-mrxb9\") pod \"migrator-59844c95c7-mz2wm\" (UID: \"2ce1b289-5291-4da2-afbc-0b5320e730b9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787490 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8487f2-85df-4de3-a487-34f79c15ef8a-config\") pod \"kube-apiserver-operator-766d6c64bb-j6hfm\" (UID: \"2d8487f2-85df-4de3-a487-34f79c15ef8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787521 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x7vn4\" (UniqueName: \"kubernetes.io/projected/34a9a96c-7d9a-412b-ac98-76747d89f7ba-kube-api-access-x7vn4\") pod \"catalog-operator-68c6474976-dmbpv\" (UID: 
\"34a9a96c-7d9a-412b-ac98-76747d89f7ba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787542 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81a4f4cd-feb2-4c87-99f7-04202818012f-service-ca-bundle\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787588 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f920728-8d6a-43d6-989a-4ee1665c76ab-apiservice-cert\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787609 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-config\") pod \"console-operator-58897d9998-zlklv\" (UID: \"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787628 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-plugins-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787647 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/709e108a-3d2e-473b-bff3-28cb1269f598-metrics-tls\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787668 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787694 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t8bfk\" (UniqueName: \"kubernetes.io/projected/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-kube-api-access-t8bfk\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787713 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6fhz7\" (UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787731 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5394f70-6289-4ea8-8169-b05a765fccfd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r9gfq\" (UID: \"c5394f70-6289-4ea8-8169-b05a765fccfd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787772 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-mountpoint-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787794 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v2pnj\" (UniqueName: \"kubernetes.io/projected/24b9f616-8834-4b47-8b4a-1d25d5efb4f2-kube-api-access-v2pnj\") pod \"kube-storage-version-migrator-operator-b67b599dd-5nnrh\" (UID: \"24b9f616-8834-4b47-8b4a-1d25d5efb4f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787812 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/81a4f4cd-feb2-4c87-99f7-04202818012f-stats-auth\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787831 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/81a4f4cd-feb2-4c87-99f7-04202818012f-default-certificate\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787870 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.787888 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/2cec3fd7-c8be-4ea2-b196-6262ab488fac-images\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.788174 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/2563e1e5-aeb2-4d41-857f-171e91d41281-signing-cabundle\") pod \"service-ca-9c57cc56f-p564j\" (UID: \"2563e1e5-aeb2-4d41-857f-171e91d41281\") " pod="openshift-service-ca/service-ca-9c57cc56f-p564j" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.788451 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/2cec3fd7-c8be-4ea2-b196-6262ab488fac-images\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.789388 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c5394f70-6289-4ea8-8169-b05a765fccfd-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r9gfq\" (UID: \"c5394f70-6289-4ea8-8169-b05a765fccfd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.789572 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-mountpoint-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.789584 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-csi-data-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.790783 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.791109 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/8f920728-8d6a-43d6-989a-4ee1665c76ab-tmpfs\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.791240 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-registration-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.792188 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-client-ca\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.792736 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/24b9f616-8834-4b47-8b4a-1d25d5efb4f2-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-5nnrh\" (UID: \"24b9f616-8834-4b47-8b4a-1d25d5efb4f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 
10:58:41.792765 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/2cec3fd7-c8be-4ea2-b196-6262ab488fac-auth-proxy-config\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.793049 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/ddddd02c-4970-4017-a493-1f9eb50214f3-plugins-dir\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.793520 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0d96629a-14b8-4f25-a58e-65e5eaf8b141-metrics-tls\") pod \"dns-default-7xktg\" (UID: \"0d96629a-14b8-4f25-a58e-65e5eaf8b141\") " pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.793560 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/81a4f4cd-feb2-4c87-99f7-04202818012f-service-ca-bundle\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.793698 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/2563e1e5-aeb2-4d41-857f-171e91d41281-signing-key\") pod \"service-ca-9c57cc56f-p564j\" (UID: \"2563e1e5-aeb2-4d41-857f-171e91d41281\") " pod="openshift-service-ca/service-ca-9c57cc56f-p564j" Mar 20 10:58:41 crc kubenswrapper[4772]: E0320 10:58:41.793959 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.293927025 +0000 UTC m=+208.384893530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.794479 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-config\") pod \"console-operator-58897d9998-zlklv\" (UID: \"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.794597 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2d8487f2-85df-4de3-a487-34f79c15ef8a-config\") pod \"kube-apiserver-operator-766d6c64bb-j6hfm\" (UID: \"2d8487f2-85df-4de3-a487-34f79c15ef8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.794799 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bd360a95-92b0-4cbc-b73f-87bb4274bff5-config\") pod \"service-ca-operator-777779d784-c8tzc\" (UID: \"bd360a95-92b0-4cbc-b73f-87bb4274bff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.795777 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/b439d8ad-3896-4c92-bd13-e094f9a63b7c-certs\") pod \"machine-config-server-g9kg7\" (UID: \"b439d8ad-3896-4c92-bd13-e094f9a63b7c\") " pod="openshift-machine-config-operator/machine-config-server-g9kg7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.796486 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bd360a95-92b0-4cbc-b73f-87bb4274bff5-serving-cert\") pod \"service-ca-operator-777779d784-c8tzc\" (UID: \"bd360a95-92b0-4cbc-b73f-87bb4274bff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.796775 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/709e108a-3d2e-473b-bff3-28cb1269f598-trusted-ca\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.797267 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.797430 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/9f4d9edb-87ce-41e1-9cc0-aaf07230ec92-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-8l854\" (UID: \"9f4d9edb-87ce-41e1-9cc0-aaf07230ec92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.798473 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-serving-cert\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.800153 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-6fhz7\" (UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.800745 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/81a4f4cd-feb2-4c87-99f7-04202818012f-default-certificate\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.801004 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5-profile-collector-cert\") pod \"olm-operator-6b444d44fb-c2hdq\" (UID: \"3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.801088 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c42f4b6b-59e0-41f7-a6de-494a458b064b-cert\") pod \"ingress-canary-2s9dj\" (UID: \"c42f4b6b-59e0-41f7-a6de-494a458b064b\") " pod="openshift-ingress-canary/ingress-canary-2s9dj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.801760 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/8f920728-8d6a-43d6-989a-4ee1665c76ab-apiservice-cert\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.801976 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/81a4f4cd-feb2-4c87-99f7-04202818012f-metrics-certs\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.802197 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-twbvs\" (UniqueName: \"kubernetes.io/projected/13ebbb5a-3355-4661-92f7-651afafe19e1-kube-api-access-twbvs\") pod \"apiserver-76f77b778f-s7p9n\" (UID: \"13ebbb5a-3355-4661-92f7-651afafe19e1\") " pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: 
I0320 10:58:41.802566 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/34a9a96c-7d9a-412b-ac98-76747d89f7ba-profile-collector-cert\") pod \"catalog-operator-68c6474976-dmbpv\" (UID: \"34a9a96c-7d9a-412b-ac98-76747d89f7ba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.802726 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06381439-6997-45aa-8dce-62b012b0ac68-secret-volume\") pod \"collect-profiles-29566725-gdkpg\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.803559 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/bfa70bc3-d525-49db-94fd-316370428815-proxy-tls\") pod \"machine-config-controller-84d6567774-n69m8\" (UID: \"bfa70bc3-d525-49db-94fd-316370428815\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.804184 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/709e108a-3d2e-473b-bff3-28cb1269f598-metrics-tls\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.804320 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/81a4f4cd-feb2-4c87-99f7-04202818012f-stats-auth\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.805021 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/ef57ed62-2f1f-4411-974a-f4cf6c624e25-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-m22rd\" (UID: \"ef57ed62-2f1f-4411-974a-f4cf6c624e25\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.805309 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/addc805d-7a3b-48d5-8f51-febb095bf28a-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-rjcd2\" (UID: \"addc805d-7a3b-48d5-8f51-febb095bf28a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.805942 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/24b9f616-8834-4b47-8b4a-1d25d5efb4f2-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-5nnrh\" (UID: \"24b9f616-8834-4b47-8b4a-1d25d5efb4f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.806985 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/c5394f70-6289-4ea8-8169-b05a765fccfd-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r9gfq\" (UID: \"c5394f70-6289-4ea8-8169-b05a765fccfd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.808464 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jtfzm\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-kube-api-access-jtfzm\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.833670 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zvrqj\" (UniqueName: \"kubernetes.io/projected/aa1a0fe0-24cb-4a49-9c1c-9624889ccf31-kube-api-access-zvrqj\") pod \"cluster-samples-operator-665b6dd947-zbcmq\" (UID: \"aa1a0fe0-24cb-4a49-9c1c-9624889ccf31\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.849316 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.879938 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sjfrb\" (UniqueName: \"kubernetes.io/projected/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-kube-api-access-sjfrb\") pod \"controller-manager-879f6c89f-zlsr2\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.887272 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.888784 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:41 crc kubenswrapper[4772]: E0320 10:58:41.889354 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.389334945 +0000 UTC m=+208.480301430 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.894516 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9tcpb\" (UniqueName: \"kubernetes.io/projected/ddddd02c-4970-4017-a493-1f9eb50214f3-kube-api-access-9tcpb\") pod \"csi-hostpathplugin-f7txt\" (UID: \"ddddd02c-4970-4017-a493-1f9eb50214f3\") " pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.913405 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dl6dx\" (UniqueName: \"kubernetes.io/projected/06381439-6997-45aa-8dce-62b012b0ac68-kube-api-access-dl6dx\") pod \"collect-profiles-29566725-gdkpg\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.934623 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nxnz5\" (UniqueName: \"kubernetes.io/projected/b439d8ad-3896-4c92-bd13-e094f9a63b7c-kube-api-access-nxnz5\") pod \"machine-config-server-g9kg7\" (UID: \"b439d8ad-3896-4c92-bd13-e094f9a63b7c\") " pod="openshift-machine-config-operator/machine-config-server-g9kg7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.950811 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jwsx\" (UniqueName: \"kubernetes.io/projected/bfa70bc3-d525-49db-94fd-316370428815-kube-api-access-5jwsx\") pod \"machine-config-controller-84d6567774-n69m8\" (UID: \"bfa70bc3-d525-49db-94fd-316370428815\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.951524 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-f7txt" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.960871 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-g9kg7" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.975029 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mtzl8\" (UniqueName: \"kubernetes.io/projected/81a4f4cd-feb2-4c87-99f7-04202818012f-kube-api-access-mtzl8\") pod \"router-default-5444994796-9drnj\" (UID: \"81a4f4cd-feb2-4c87-99f7-04202818012f\") " pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.980917 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-lg2z9"] Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.981775 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-m6mzp"] Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.982369 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-5r5fg"] Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.989538 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/2d8487f2-85df-4de3-a487-34f79c15ef8a-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-j6hfm\" (UID: \"2d8487f2-85df-4de3-a487-34f79c15ef8a\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" Mar 20 10:58:41 crc kubenswrapper[4772]: I0320 10:58:41.995920 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:41 crc kubenswrapper[4772]: E0320 10:58:41.996436 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.496420503 +0000 UTC m=+208.587386988 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.011889 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c5394f70-6289-4ea8-8169-b05a765fccfd-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-r9gfq\" (UID: \"c5394f70-6289-4ea8-8169-b05a765fccfd\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.024005 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.035674 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.036828 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.051011 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hq2ch\" (UniqueName: \"kubernetes.io/projected/709e108a-3d2e-473b-bff3-28cb1269f598-kube-api-access-hq2ch\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.060906 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.077033 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmvgr\" (UniqueName: \"kubernetes.io/projected/2cec3fd7-c8be-4ea2-b196-6262ab488fac-kube-api-access-nmvgr\") pod \"machine-config-operator-74547568cd-ctkwn\" (UID: \"2cec3fd7-c8be-4ea2-b196-6262ab488fac\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.090799 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhlbx\" (UniqueName: \"kubernetes.io/projected/77c87234-b79b-4d2f-8ee3-b14aa050925a-kube-api-access-nhlbx\") pod \"marketplace-operator-79b997595-6fhz7\" (UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.102747 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.104889 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-mktq6"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.108054 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.108179 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.608159252 +0000 UTC m=+208.699125737 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.108613 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.108991 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.608973535 +0000 UTC m=+208.699940020 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.109106 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.116681 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.128834 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-stnlj\" (UniqueName: \"kubernetes.io/projected/addc805d-7a3b-48d5-8f51-febb095bf28a-kube-api-access-stnlj\") pod \"package-server-manager-789f6589d5-rjcd2\" (UID: \"addc805d-7a3b-48d5-8f51-febb095bf28a\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.129805 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bb76d\" (UniqueName: \"kubernetes.io/projected/bd360a95-92b0-4cbc-b73f-87bb4274bff5-kube-api-access-bb76d\") pod \"service-ca-operator-777779d784-c8tzc\" (UID: \"bd360a95-92b0-4cbc-b73f-87bb4274bff5\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.130796 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-lvstj"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.132070 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.132179 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.150481 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4fhzv\" (UniqueName: \"kubernetes.io/projected/ef57ed62-2f1f-4411-974a-f4cf6c624e25-kube-api-access-4fhzv\") pod \"multus-admission-controller-857f4d67dd-m22rd\" (UID: \"ef57ed62-2f1f-4411-974a-f4cf6c624e25\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" Mar 20 10:58:42 crc kubenswrapper[4772]: W0320 10:58:42.162796 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcbae76fd_5602_4144_907a_5aeaafbedee7.slice/crio-fa4e6cac6c5c103f3f59ce11bcffab6a32530330bdce4003cca0923529460c39 WatchSource:0}: Error finding container fa4e6cac6c5c103f3f59ce11bcffab6a32530330bdce4003cca0923529460c39: Status 404 returned error can't find the container with id fa4e6cac6c5c103f3f59ce11bcffab6a32530330bdce4003cca0923529460c39 Mar 20 10:58:42 crc kubenswrapper[4772]: W0320 10:58:42.173402 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45c4e1f3_f8fb_41c5_96a1_dc5f86f256e6.slice/crio-9e5e4393eeb05b290cf3f4d141d61da26fe4e82767e340e068a23bf69ee5dab8 WatchSource:0}: Error finding container 9e5e4393eeb05b290cf3f4d141d61da26fe4e82767e340e068a23bf69ee5dab8: Status 404 returned error can't find the container with id 9e5e4393eeb05b290cf3f4d141d61da26fe4e82767e340e068a23bf69ee5dab8 Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.174407 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rmx56\" (UniqueName: \"kubernetes.io/projected/2563e1e5-aeb2-4d41-857f-171e91d41281-kube-api-access-rmx56\") pod \"service-ca-9c57cc56f-p564j\" (UID: \"2563e1e5-aeb2-4d41-857f-171e91d41281\") " pod="openshift-service-ca/service-ca-9c57cc56f-p564j" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.174564 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:42 crc kubenswrapper[4772]: W0320 10:58:42.175066 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec8f3e81_ffd3_493f_93e4_d00e371c923c.slice/crio-e567e3d97886d45537280a211a1863a54e9e6ee905e23ccf5f7970906639a947 WatchSource:0}: Error finding container e567e3d97886d45537280a211a1863a54e9e6ee905e23ccf5f7970906639a947: Status 404 returned error can't find the container with id e567e3d97886d45537280a211a1863a54e9e6ee905e23ccf5f7970906639a947 Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.194043 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fzpt5\" (UniqueName: \"kubernetes.io/projected/0d96629a-14b8-4f25-a58e-65e5eaf8b141-kube-api-access-fzpt5\") pod \"dns-default-7xktg\" (UID: \"0d96629a-14b8-4f25-a58e-65e5eaf8b141\") " pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.209630 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.211223 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.711207627 +0000 UTC m=+208.802174112 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.211500 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x7vn4\" (UniqueName: \"kubernetes.io/projected/34a9a96c-7d9a-412b-ac98-76747d89f7ba-kube-api-access-x7vn4\") pod \"catalog-operator-68c6474976-dmbpv\" (UID: \"34a9a96c-7d9a-412b-ac98-76747d89f7ba\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.229678 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.234482 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t8bfk\" (UniqueName: \"kubernetes.io/projected/ca311ab1-1d0d-4a92-9a6d-f14f1e269333-kube-api-access-t8bfk\") pod \"cluster-image-registry-operator-dc59b4c8b-4vpft\" (UID: \"ca311ab1-1d0d-4a92-9a6d-f14f1e269333\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.235134 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.244058 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-p564j" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.254091 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrsrb\" (UniqueName: \"kubernetes.io/projected/9f4d9edb-87ce-41e1-9cc0-aaf07230ec92-kube-api-access-hrsrb\") pod \"control-plane-machine-set-operator-78cbb6b69f-8l854\" (UID: \"9f4d9edb-87ce-41e1-9cc0-aaf07230ec92\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.260859 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-fgwgm"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.264775 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rq497"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.273660 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cc2pb\" (UniqueName: \"kubernetes.io/projected/65a4877c-95f9-4a93-9c74-ecde8d9a7b95-kube-api-access-cc2pb\") pod \"console-operator-58897d9998-zlklv\" (UID: \"65a4877c-95f9-4a93-9c74-ecde8d9a7b95\") " pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.292253 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v2pnj\" (UniqueName: \"kubernetes.io/projected/24b9f616-8834-4b47-8b4a-1d25d5efb4f2-kube-api-access-v2pnj\") pod \"kube-storage-version-migrator-operator-b67b599dd-5nnrh\" (UID: \"24b9f616-8834-4b47-8b4a-1d25d5efb4f2\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.309187 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-f7txt"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.323879 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.324512 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.82449263 +0000 UTC m=+208.915459115 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.326956 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-drz9m"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.328546 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.328885 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.335588 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/709e108a-3d2e-473b-bff3-28cb1269f598-bound-sa-token\") pod \"ingress-operator-5b745b69d9-wk6fq\" (UID: \"709e108a-3d2e-473b-bff3-28cb1269f598\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.339981 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tbh48\" (UniqueName: \"kubernetes.io/projected/c42f4b6b-59e0-41f7-a6de-494a458b064b-kube-api-access-tbh48\") pod \"ingress-canary-2s9dj\" (UID: \"c42f4b6b-59e0-41f7-a6de-494a458b064b\") " pod="openshift-ingress-canary/ingress-canary-2s9dj" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.344596 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.353772 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.354045 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d66cw\" (UniqueName: \"kubernetes.io/projected/9309f110-5a80-46ca-b3de-8087048c13e2-kube-api-access-d66cw\") pod \"auto-csr-approver-29566738-zlqcq\" (UID: \"9309f110-5a80-46ca-b3de-8087048c13e2\") " pod="openshift-infra/auto-csr-approver-29566738-zlqcq" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.368967 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.376687 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mrxb9\" (UniqueName: \"kubernetes.io/projected/2ce1b289-5291-4da2-afbc-0b5320e730b9-kube-api-access-mrxb9\") pod \"migrator-59844c95c7-mz2wm\" (UID: \"2ce1b289-5291-4da2-afbc-0b5320e730b9\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.378813 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.387582 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.396482 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.396754 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.407593 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xxr5\" (UniqueName: \"kubernetes.io/projected/8f920728-8d6a-43d6-989a-4ee1665c76ab-kube-api-access-8xxr5\") pod \"packageserver-d55dfcdfc-zcpz4\" (UID: \"8f920728-8d6a-43d6-989a-4ee1665c76ab\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.414013 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c77bz\" (UniqueName: \"kubernetes.io/projected/3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5-kube-api-access-c77bz\") pod \"olm-operator-6b444d44fb-c2hdq\" (UID: \"3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.419155 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.425403 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.425796 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.426085 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:42.926051532 +0000 UTC m=+209.017018017 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.428428 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-s7p9n"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.439889 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.444775 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.457115 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:42 crc kubenswrapper[4772]: W0320 10:58:42.463419 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2d8487f2_85df_4de3_a487_34f79c15ef8a.slice/crio-ab0760829a3296992509fed2cc3978f04d088e0c777c556f5c90598befee4961 WatchSource:0}: Error finding container ab0760829a3296992509fed2cc3978f04d088e0c777c556f5c90598befee4961: Status 404 returned error can't find the container with id ab0760829a3296992509fed2cc3978f04d088e0c777c556f5c90598befee4961 Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.477350 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.508075 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-zlqcq" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.517896 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.520292 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zlsr2"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.526975 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.527283 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.027270306 +0000 UTC m=+209.118236791 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: W0320 10:58:42.536611 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod06381439_6997_45aa_8dce_62b012b0ac68.slice/crio-6db455a1c0890d2f4c332ded4da5f2ba5e4832736b33b1cd3b9d8218edc37c8e WatchSource:0}: Error finding container 6db455a1c0890d2f4c332ded4da5f2ba5e4832736b33b1cd3b9d8218edc37c8e: Status 404 returned error can't find the container with id 6db455a1c0890d2f4c332ded4da5f2ba5e4832736b33b1cd3b9d8218edc37c8e Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.586298 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-2s9dj" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.623393 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.629440 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.629717 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.129700524 +0000 UTC m=+209.220667009 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.629796 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.630124 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.130116245 +0000 UTC m=+209.221082730 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.692240 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6fhz7"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.718769 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.733003 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.733854 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.233820699 +0000 UTC m=+209.324787194 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: W0320 10:58:42.741014 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbfa70bc3_d525_49db_94fd_316370428815.slice/crio-9453127e07e5c94fdb9ce89d27e43ef1b05d55205cdde7e7a6de4a323cfb517f WatchSource:0}: Error finding container 9453127e07e5c94fdb9ce89d27e43ef1b05d55205cdde7e7a6de4a323cfb517f: Status 404 returned error can't find the container with id 9453127e07e5c94fdb9ce89d27e43ef1b05d55205cdde7e7a6de4a323cfb517f Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.744309 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g9kg7" event={"ID":"b439d8ad-3896-4c92-bd13-e094f9a63b7c","Type":"ContainerStarted","Data":"8a3fb818c549a80bbedb6b4764a83e597950231de1d26c4248b772f96833fccf"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.744341 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-g9kg7" event={"ID":"b439d8ad-3896-4c92-bd13-e094f9a63b7c","Type":"ContainerStarted","Data":"ac29415c2ad14f64938f9319003b75397ff8b3a66aafe4c71784eeb875b6b20b"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.745473 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" 
event={"ID":"13ebbb5a-3355-4661-92f7-651afafe19e1","Type":"ContainerStarted","Data":"55b39d6d2e727f5e92e7d7b75392455be5d5187df08d277792c90cdf70a3e450"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.746600 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" event={"ID":"cbae76fd-5602-4144-907a-5aeaafbedee7","Type":"ContainerStarted","Data":"fa4e6cac6c5c103f3f59ce11bcffab6a32530330bdce4003cca0923529460c39"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.749167 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lg2z9" event={"ID":"6872c5d1-0892-4abc-9c68-5fe459ed1107","Type":"ContainerStarted","Data":"ec30e457458705572942f6011be090081619e8c1c9546eac059f65fee02a0407"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.749188 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-lg2z9" event={"ID":"6872c5d1-0892-4abc-9c68-5fe459ed1107","Type":"ContainerStarted","Data":"3b809151f0be9677b82d9102296298b1c4941de1ddf2dcbf97216722d8483d75"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.749802 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-lg2z9" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.752293 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2z9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.752327 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2z9" podUID="6872c5d1-0892-4abc-9c68-5fe459ed1107" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.753056 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" event={"ID":"c330bb63-a0e1-4650-b3f9-71e0bf85da61","Type":"ContainerStarted","Data":"e7a2048d99683314f3daddababc1f363f19fde07ac3b111e793ceeb7e2875c40"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.763532 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" event={"ID":"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6","Type":"ContainerStarted","Data":"b164d764ad538de648f5be3353959edca49979a496a9c866c5fe05d14dca9816"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.763578 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" event={"ID":"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6","Type":"ContainerStarted","Data":"9e5e4393eeb05b290cf3f4d141d61da26fe4e82767e340e068a23bf69ee5dab8"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.781293 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" event={"ID":"ec8f3e81-ffd3-493f-93e4-d00e371c923c","Type":"ContainerStarted","Data":"e567e3d97886d45537280a211a1863a54e9e6ee905e23ccf5f7970906639a947"} Mar 20 10:58:42 crc kubenswrapper[4772]: W0320 10:58:42.781473 4772 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod77c87234_b79b_4d2f_8ee3_b14aa050925a.slice/crio-f2f2b410f180f03d7ab26a78482481df27137452b69b0a8549214b4180af066d WatchSource:0}: Error finding container f2f2b410f180f03d7ab26a78482481df27137452b69b0a8549214b4180af066d: Status 404 returned error can't find the container with id f2f2b410f180f03d7ab26a78482481df27137452b69b0a8549214b4180af066d Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.784907 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f7txt" event={"ID":"ddddd02c-4970-4017-a493-1f9eb50214f3","Type":"ContainerStarted","Data":"72b8926acc84c3ea84d3a8c70cc15422bf387476cd92fc1c764a89a9a2ca9ab8"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.790891 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" event={"ID":"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5","Type":"ContainerStarted","Data":"f50ba9df3a18bb3155114b346bc96ce3aeb593a4406d18410ed48f830029ac02"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.811968 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" event={"ID":"aa1a0fe0-24cb-4a49-9c1c-9624889ccf31","Type":"ContainerStarted","Data":"57aa4ad3b491c6baae00dd808109d91e378a83fbd8e1d755a414539e9612eac4"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.818789 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" event={"ID":"bf2c75a2-ca6a-415f-80ea-830f55899119","Type":"ContainerStarted","Data":"6a408b09184bbaed00a77d617df0060cc0c91e0a595583865a94eb22242f817f"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.825946 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" event={"ID":"908b84fc-c766-408a-905f-79ddf440ba2b","Type":"ContainerStarted","Data":"f4ae4ec5ae7ee7f603eb9ca35fce2ab7fa11d1bde36d9844996ac2a6f4a11391"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.825983 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" event={"ID":"908b84fc-c766-408a-905f-79ddf440ba2b","Type":"ContainerStarted","Data":"a709e557edbb997d832bb88ff95547f7935a329ab2870848e0da2dc16a0a7306"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.830789 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc"] Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.832131 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fgwgm" event={"ID":"f7c20397-4233-45e6-a7f9-5e88942e7abf","Type":"ContainerStarted","Data":"992c5443a50bbbf5c4d7a2cd973bfdefb8244e6d96035d5e04e58f9cfe3739c0"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.835422 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.837644 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.337631855 +0000 UTC m=+209.428598340 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.839115 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" event={"ID":"1ede39ac-a466-4925-8a5a-1dd6679b1915","Type":"ContainerStarted","Data":"82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.839147 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" event={"ID":"1ede39ac-a466-4925-8a5a-1dd6679b1915","Type":"ContainerStarted","Data":"a9d2f1d8482e6ba0c77c8f1fb9edb5b9aa5ce4be38802e1ff1741f2c811d27e1"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.839890 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.859352 4772 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-schhc container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.859397 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" podUID="1ede39ac-a466-4925-8a5a-1dd6679b1915" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.860603 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" event={"ID":"a5752755-110f-47d1-b9cf-ff3e35aabf8f","Type":"ContainerStarted","Data":"3a11e310d9bbe1509a5fc6f50401ac2c8dbb75225bef8fbfc11ba9ea23bb3526"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.860634 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" event={"ID":"a5752755-110f-47d1-b9cf-ff3e35aabf8f","Type":"ContainerStarted","Data":"8f4616903b09916dc9d1e7d4493e90c8d1c9ed1d58722ab91635e513cf5cfb69"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.863877 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" event={"ID":"2d8487f2-85df-4de3-a487-34f79c15ef8a","Type":"ContainerStarted","Data":"ab0760829a3296992509fed2cc3978f04d088e0c777c556f5c90598befee4961"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.867619 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress/router-default-5444994796-9drnj" event={"ID":"81a4f4cd-feb2-4c87-99f7-04202818012f","Type":"ContainerStarted","Data":"0e534d2e6412ef8c21bf6770652da1101d93fa76da718b775e8ea55d88d905ca"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.879390 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" event={"ID":"cdae4fbe-72ef-47c6-a521-120230421079","Type":"ContainerStarted","Data":"56ea6360962dbceb15f13cae6f93dda5523c6292259f65953d02fec40cbd0914"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.885294 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" event={"ID":"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3","Type":"ContainerStarted","Data":"496c1f4fcea66366a2317b5b80cba50a4e9c9520dea045ccaed0f798745d8a6a"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.886234 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" event={"ID":"06381439-6997-45aa-8dce-62b012b0ac68","Type":"ContainerStarted","Data":"6db455a1c0890d2f4c332ded4da5f2ba5e4832736b33b1cd3b9d8218edc37c8e"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.908333 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" event={"ID":"4bc1666f-8e18-4c98-8424-01fe5598d275","Type":"ContainerStarted","Data":"713e4302ec3af0563d4af6eb975dbcbaa85ef6c131c6f785f7303c39656a46b1"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.920887 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" event={"ID":"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a","Type":"ContainerStarted","Data":"ab716c888d19dfbd362a17979b5f1cf86d41ceb7398a802c5ae79d8d6098c456"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.920954 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" event={"ID":"b19a40a6-c2c9-47ec-8da6-bd23833c5a4a","Type":"ContainerStarted","Data":"e468e0d8168b9ccc84ec4b810042f7285a9d7a07ba2295db458d6c2713fe8510"} Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.937481 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.937818 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.437781049 +0000 UTC m=+209.528747534 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.938877 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:42 crc kubenswrapper[4772]: E0320 10:58:42.939371 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.439356142 +0000 UTC m=+209.530322627 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:42 crc kubenswrapper[4772]: I0320 10:58:42.946227 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-p564j"] Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.041502 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.041728 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.541680197 +0000 UTC m=+209.632646692 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.042036 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.042410 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.542396447 +0000 UTC m=+209.633362932 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.087798 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-g9kg7" podStartSLOduration=4.087770172 podStartE2EDuration="4.087770172s" podCreationTimestamp="2026-03-20 10:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:43.08625642 +0000 UTC m=+209.177222915" watchObservedRunningTime="2026-03-20 10:58:43.087770172 +0000 UTC m=+209.178736657" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.144700 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.144960 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.644910168 +0000 UTC m=+209.735876653 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.145323 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.145757 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.645747581 +0000 UTC m=+209.736714066 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.152253 4772 ???:1] "http: TLS handshake error from 192.168.126.11:50306: no serving certificate available for the kubelet" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.210471 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" podStartSLOduration=165.210452958 podStartE2EDuration="2m45.210452958s" podCreationTimestamp="2026-03-20 10:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:43.208562106 +0000 UTC m=+209.299528591" watchObservedRunningTime="2026-03-20 10:58:43.210452958 +0000 UTC m=+209.301419443" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.243775 4772 ???:1] "http: TLS handshake error from 192.168.126.11:50316: no serving certificate available for the kubelet" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.259472 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.259682 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.759650201 +0000 UTC m=+209.850616686 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.259851 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.260159 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.760151845 +0000 UTC m=+209.851118330 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.303533 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-dtdpj" podStartSLOduration=166.303514143 podStartE2EDuration="2m46.303514143s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:43.247891441 +0000 UTC m=+209.338857936" watchObservedRunningTime="2026-03-20 10:58:43.303514143 +0000 UTC m=+209.394480628" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.304060 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-d2cpm" podStartSLOduration=166.304053859 podStartE2EDuration="2m46.304053859s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:43.300999733 +0000 UTC m=+209.391966218" watchObservedRunningTime="2026-03-20 10:58:43.304053859 +0000 UTC m=+209.395020344" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.360966 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.361391 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 
podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.861346618 +0000 UTC m=+209.952313093 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.361504 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.361815 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.86180166 +0000 UTC m=+209.952768145 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.362247 4772 ???:1] "http: TLS handshake error from 192.168.126.11:50332: no serving certificate available for the kubelet" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.448798 4772 ???:1] "http: TLS handshake error from 192.168.126.11:50336: no serving certificate available for the kubelet" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.462242 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.462643 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:43.962629193 +0000 UTC m=+210.053595678 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.515981 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2"] Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.559911 4772 ???:1] "http: TLS handshake error from 192.168.126.11:50352: no serving certificate available for the kubelet" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.569370 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft"] Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.571405 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.571730 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:44.071658166 +0000 UTC m=+210.162624681 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.606205 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-8ftlb" podStartSLOduration=166.606187927 podStartE2EDuration="2m46.606187927s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:43.605657791 +0000 UTC m=+209.696624276" watchObservedRunningTime="2026-03-20 10:58:43.606187927 +0000 UTC m=+209.697154402" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.644987 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-lg2z9" podStartSLOduration=166.644970715 podStartE2EDuration="2m46.644970715s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:43.642929809 +0000 UTC m=+209.733896294" watchObservedRunningTime="2026-03-20 10:58:43.644970715 +0000 UTC m=+209.735937200" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.672574 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.673060 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:44.173042834 +0000 UTC m=+210.264009319 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.774833 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.775480 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:44.275469002 +0000 UTC m=+210.366435487 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.780077 4772 ???:1] "http: TLS handshake error from 192.168.126.11:50368: no serving certificate available for the kubelet" Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.876335 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.878383 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:44.378358802 +0000 UTC m=+210.469325287 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.942359 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" event={"ID":"77c87234-b79b-4d2f-8ee3-b14aa050925a","Type":"ContainerStarted","Data":"f2f2b410f180f03d7ab26a78482481df27137452b69b0a8549214b4180af066d"} Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.981120 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:43 crc kubenswrapper[4772]: E0320 10:58:43.982079 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:44.482062896 +0000 UTC m=+210.573029381 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:43 crc kubenswrapper[4772]: I0320 10:58:43.993702 4772 ???:1] "http: TLS handshake error from 192.168.126.11:50376: no serving certificate available for the kubelet" Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.005502 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fgwgm" event={"ID":"f7c20397-4233-45e6-a7f9-5e88942e7abf","Type":"ContainerStarted","Data":"5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974"} Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.016473 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.016530 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.062987 4772 generic.go:334] "Generic (PLEG): container finished" podID="cdae4fbe-72ef-47c6-a521-120230421079" containerID="512c50513e84f681035a6cece8893c15a6bcd6d20616621c54964e33a3835a5c" exitCode=0 Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.063091 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" event={"ID":"cdae4fbe-72ef-47c6-a521-120230421079","Type":"ContainerDied","Data":"512c50513e84f681035a6cece8893c15a6bcd6d20616621c54964e33a3835a5c"} Mar 20 
10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.074281 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" event={"ID":"bfa70bc3-d525-49db-94fd-316370428815","Type":"ContainerStarted","Data":"9453127e07e5c94fdb9ce89d27e43ef1b05d55205cdde7e7a6de4a323cfb517f"} Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.082041 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-p564j" event={"ID":"2563e1e5-aeb2-4d41-857f-171e91d41281","Type":"ContainerStarted","Data":"61be50bf4bc124abf81a9f2ba282ce698f40c7f0df85b8a867aebaa573640dce"} Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.085442 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:44 crc kubenswrapper[4772]: E0320 10:58:44.086485 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:44.586465799 +0000 UTC m=+210.677432284 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.102564 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" event={"ID":"bd360a95-92b0-4cbc-b73f-87bb4274bff5","Type":"ContainerStarted","Data":"94a0876464dcf346db1e3c89c278851043e4a9a0bffc4d4633f20fa27505828d"} Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.104864 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" event={"ID":"cbae76fd-5602-4144-907a-5aeaafbedee7","Type":"ContainerStarted","Data":"53e3d982810cff836b1da92bfcb0e9234191a4b8974a458c1dce3d7ee5c4f69d"} Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.123910 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" event={"ID":"addc805d-7a3b-48d5-8f51-febb095bf28a","Type":"ContainerStarted","Data":"ed683dc898b6e7e33f7c9726dd75bb3e64ed1b9f8ec39b536b78cb437d7de959"} Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.132590 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" event={"ID":"c5394f70-6289-4ea8-8169-b05a765fccfd","Type":"ContainerStarted","Data":"d79da4130a52068331dcfacbc8ecb18c07160bd438fb769bfd906757689b29c1"} Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.137359 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" 
event={"ID":"ec8f3e81-ffd3-493f-93e4-d00e371c923c","Type":"ContainerStarted","Data":"b1467f4b7d76624d92cb19786f6de46352a50080c60244351e89d725618910f9"} Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.140178 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" event={"ID":"ca311ab1-1d0d-4a92-9a6d-f14f1e269333","Type":"ContainerStarted","Data":"37d8c36b1e4e929305d769afd7e7337aa60279f54f3c004d733cf7c701709380"} Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.142373 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-9drnj" event={"ID":"81a4f4cd-feb2-4c87-99f7-04202818012f","Type":"ContainerStarted","Data":"9a312c898c9331edd6a552e1a2099bc9e2af817a91b3c9443dbd9c66f48eee57"} Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.143446 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2z9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.143486 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2z9" podUID="6872c5d1-0892-4abc-9c68-5fe459ed1107" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.149275 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.190150 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.191915 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn"] Mar 20 10:58:44 crc kubenswrapper[4772]: E0320 10:58:44.195689 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:44.695671746 +0000 UTC m=+210.786638231 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.226949 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-5r5fg" podStartSLOduration=167.226910594 podStartE2EDuration="2m47.226910594s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:44.213512677 +0000 UTC m=+210.304479162" watchObservedRunningTime="2026-03-20 10:58:44.226910594 +0000 UTC m=+210.317877079" Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.233816 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.235942 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-zlklv"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.238667 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-7xktg"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.244105 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.273347 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.308731 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:44 crc kubenswrapper[4772]: E0320 10:58:44.309032 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:44.80901031 +0000 UTC m=+210.899976795 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.309733 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:44 crc kubenswrapper[4772]: E0320 10:58:44.310513 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:44.810488802 +0000 UTC m=+210.901455287 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.352337 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-zlqcq"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.356704 4772 ???:1] "http: TLS handshake error from 192.168.126.11:50388: no serving certificate available for the kubelet" Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.364362 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.366135 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-m22rd"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.399751 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-5fk2x" podStartSLOduration=167.399733229 podStartE2EDuration="2m47.399733229s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:44.380037926 +0000 UTC m=+210.471004411" watchObservedRunningTime="2026-03-20 10:58:44.399733229 +0000 UTC m=+210.490699714" Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.400086 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.412967 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:44 crc kubenswrapper[4772]: E0320 10:58:44.413469 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:44.913437544 +0000 UTC m=+211.004404029 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.430996 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-2s9dj"] Mar 20 10:58:44 crc kubenswrapper[4772]: W0320 10:58:44.460362 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc42f4b6b_59e0_41f7_a6de_494a458b064b.slice/crio-9ef9a0820da6ff91f4b4f807eb66988fda01c9f62c1520a5e62d10e70a2ce99a WatchSource:0}: Error finding container 9ef9a0820da6ff91f4b4f807eb66988fda01c9f62c1520a5e62d10e70a2ce99a: Status 404 returned error can't find the container with id 9ef9a0820da6ff91f4b4f807eb66988fda01c9f62c1520a5e62d10e70a2ce99a Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.460517 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-9drnj" podStartSLOduration=167.460501436 podStartE2EDuration="2m47.460501436s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:44.459329073 +0000 UTC m=+210.550295558" watchObservedRunningTime="2026-03-20 10:58:44.460501436 +0000 UTC m=+210.551467921" Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.474802 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.492939 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-mktq6" podStartSLOduration=167.492923467 podStartE2EDuration="2m47.492923467s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:44.491620091 +0000 UTC m=+210.582586576" watchObservedRunningTime="2026-03-20 10:58:44.492923467 +0000 UTC m=+210.583889952" Mar 20 10:58:44 crc kubenswrapper[4772]: W0320 10:58:44.498977 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef57ed62_2f1f_4411_974a_f4cf6c624e25.slice/crio-1d1a22a499a95cc11bc7f62272bc6cfdabd7f107d6729b7a2883fa5f61c98848 WatchSource:0}: Error finding container 1d1a22a499a95cc11bc7f62272bc6cfdabd7f107d6729b7a2883fa5f61c98848: Status 404 returned error can't 
find the container with id 1d1a22a499a95cc11bc7f62272bc6cfdabd7f107d6729b7a2883fa5f61c98848 Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.517517 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zlsr2"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.518543 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:44 crc kubenswrapper[4772]: E0320 10:58:44.518875 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:45.018863945 +0000 UTC m=+211.109830430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.620743 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:44 crc kubenswrapper[4772]: E0320 10:58:44.621079 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:45.121045447 +0000 UTC m=+211.212011932 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.621385 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:44 crc kubenswrapper[4772]: E0320 10:58:44.622019 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:45.121987213 +0000 UTC m=+211.212953698 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.634578 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-fgwgm" podStartSLOduration=167.634555326 podStartE2EDuration="2m47.634555326s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:44.628366012 +0000 UTC m=+210.719332507" watchObservedRunningTime="2026-03-20 10:58:44.634555326 +0000 UTC m=+210.725521811" Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.697022 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc"] Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.722764 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:44 crc kubenswrapper[4772]: E0320 10:58:44.723258 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:45.223244427 +0000 UTC m=+211.314210912 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.824687 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:44 crc kubenswrapper[4772]: E0320 10:58:44.824963 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:45.324952825 +0000 UTC m=+211.415919310 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:44 crc kubenswrapper[4772]: I0320 10:58:44.925717 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:44 crc kubenswrapper[4772]: E0320 10:58:44.926959 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:45.42694343 +0000 UTC m=+211.517909915 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.029287 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:45 crc kubenswrapper[4772]: E0320 10:58:45.030074 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:45.530057337 +0000 UTC m=+211.621023832 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.033254 4772 ???:1] "http: TLS handshake error from 192.168.126.11:50392: no serving certificate available for the kubelet" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.103417 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.108741 4772 patch_prober.go:28] interesting pod/router-default-5444994796-9drnj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:45 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Mar 20 10:58:45 crc kubenswrapper[4772]: [+]process-running ok Mar 20 10:58:45 crc kubenswrapper[4772]: healthz check failed Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.108780 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9drnj" podUID="81a4f4cd-feb2-4c87-99f7-04202818012f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.130638 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:45 crc kubenswrapper[4772]: E0320 10:58:45.131040 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:45.631022163 +0000 UTC m=+211.721988648 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.198795 4772 generic.go:334] "Generic (PLEG): container finished" podID="bf2c75a2-ca6a-415f-80ea-830f55899119" containerID="60ae45a83030894046a947035619c6b99e3c3d6653c36147be9548c086e0f238" exitCode=0 Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.198897 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" event={"ID":"bf2c75a2-ca6a-415f-80ea-830f55899119","Type":"ContainerDied","Data":"60ae45a83030894046a947035619c6b99e3c3d6653c36147be9548c086e0f238"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.203491 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" event={"ID":"aa1a0fe0-24cb-4a49-9c1c-9624889ccf31","Type":"ContainerStarted","Data":"0051d9ac0c2ab396a68abf73b722e77d291f4376405d791023732b43173d82e4"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.203521 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" event={"ID":"aa1a0fe0-24cb-4a49-9c1c-9624889ccf31","Type":"ContainerStarted","Data":"1df7ebb329c72d16989437262f15a14bbb603e1d49a8df0eecddea808b540b9c"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.214465 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" event={"ID":"ef57ed62-2f1f-4411-974a-f4cf6c624e25","Type":"ContainerStarted","Data":"1d1a22a499a95cc11bc7f62272bc6cfdabd7f107d6729b7a2883fa5f61c98848"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.233022 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.233512 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" event={"ID":"c5394f70-6289-4ea8-8169-b05a765fccfd","Type":"ContainerStarted","Data":"172375d7a604f986a29deca8174f7bf2d7ccdc924542d5c76ea16abb339b3289"} Mar 20 10:58:45 crc kubenswrapper[4772]: E0320 10:58:45.234904 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:45.734887811 +0000 UTC m=+211.825854366 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.240405 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-zbcmq" podStartSLOduration=168.240384335 podStartE2EDuration="2m48.240384335s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.24017802 +0000 UTC m=+211.331144505" watchObservedRunningTime="2026-03-20 10:58:45.240384335 +0000 UTC m=+211.331350840" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.244806 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" event={"ID":"bfa70bc3-d525-49db-94fd-316370428815","Type":"ContainerStarted","Data":"eb4c0519d2bb86c875ae8de16415303a91fc74c9c96c54e6e8757664a9c845be"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.253366 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" event={"ID":"06381439-6997-45aa-8dce-62b012b0ac68","Type":"ContainerStarted","Data":"e53c63cafd33fbfa4e94f437ac55a29be8c86aee585aa730bb52cab188476104"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.273461 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" event={"ID":"2cec3fd7-c8be-4ea2-b196-6262ab488fac","Type":"ContainerStarted","Data":"fbf7cc955cdc6bef1b354a5dd68ba2bb1b621be2198bcb54bbb963883e688a2f"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.273509 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" event={"ID":"2cec3fd7-c8be-4ea2-b196-6262ab488fac","Type":"ContainerStarted","Data":"6b8bdc77c939be4998996fb43e69c2fdd6265f2bf898e0cfe813b570713d7958"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.276242 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-r9gfq" podStartSLOduration=168.276224673 podStartE2EDuration="2m48.276224673s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.257214168 +0000 UTC m=+211.348180653" watchObservedRunningTime="2026-03-20 10:58:45.276224673 +0000 UTC m=+211.367191158" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.278234 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" podStartSLOduration=168.278225538 podStartE2EDuration="2m48.278225538s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-03-20 10:58:45.275204533 +0000 UTC m=+211.366171018" watchObservedRunningTime="2026-03-20 10:58:45.278225538 +0000 UTC m=+211.369192023" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.319787 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" event={"ID":"addc805d-7a3b-48d5-8f51-febb095bf28a","Type":"ContainerStarted","Data":"ad4ce7200ba3538485604168a2c39810ac02cf0cdc8691b22eca3a284c382ad0"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.319904 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" event={"ID":"addc805d-7a3b-48d5-8f51-febb095bf28a","Type":"ContainerStarted","Data":"dab6c506cc02f04e3b00de7067ec05d62e65eb5ae2dd09d307d9c30ed93c6b08"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.322899 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.334081 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:45 crc kubenswrapper[4772]: E0320 10:58:45.335657 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:45.835636501 +0000 UTC m=+211.926602986 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.348122 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" podStartSLOduration=167.348101051 podStartE2EDuration="2m47.348101051s" podCreationTimestamp="2026-03-20 10:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.347007451 +0000 UTC m=+211.437973936" watchObservedRunningTime="2026-03-20 10:58:45.348101051 +0000 UTC m=+211.439067536" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.349207 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" event={"ID":"cdae4fbe-72ef-47c6-a521-120230421079","Type":"ContainerStarted","Data":"e781973ef2ad67838f1ae5589a6d23a6520e9ea47c90684c29655fc2f9545e9c"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.349476 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.354650 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" event={"ID":"24b9f616-8834-4b47-8b4a-1d25d5efb4f2","Type":"ContainerStarted","Data":"c8bb6569ed038d03435547e171002242572ebb119c5d6bb44d414d616cb331b3"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.376659 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" podStartSLOduration=168.376636413 podStartE2EDuration="2m48.376636413s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.375758818 +0000 UTC m=+211.466725303" watchObservedRunningTime="2026-03-20 10:58:45.376636413 +0000 UTC m=+211.467602898" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.379695 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" event={"ID":"bd360a95-92b0-4cbc-b73f-87bb4274bff5","Type":"ContainerStarted","Data":"3fd017b3a753cff907752fc3bfb65b6219f3b56eafd06ce64e0671968df8a4c1"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.395474 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7xktg" event={"ID":"0d96629a-14b8-4f25-a58e-65e5eaf8b141","Type":"ContainerStarted","Data":"522075eb2c1397979c61230ee549998fc9318b1a8383184f37febd824f2a283c"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.395526 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7xktg" event={"ID":"0d96629a-14b8-4f25-a58e-65e5eaf8b141","Type":"ContainerStarted","Data":"04577a24c9f736a9b406f76e4a875d7baca4f8d18614da33d60584556f5c46f4"} 
Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.400096 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" podStartSLOduration=168.400075262 podStartE2EDuration="2m48.400075262s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.395623456 +0000 UTC m=+211.486589941" watchObservedRunningTime="2026-03-20 10:58:45.400075262 +0000 UTC m=+211.491041747" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.425393 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" event={"ID":"3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5","Type":"ContainerStarted","Data":"2dfade01f882d7df40bc7fc6f824a2bf3401d5794c2e52d35a8a8fd74cd488ac"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.425461 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" event={"ID":"3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5","Type":"ContainerStarted","Data":"5d72758354711d280b4b7630438a8f87c0f53c4aa1e06e7cd670ff55cfcdd580"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.427306 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.438689 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:45 crc kubenswrapper[4772]: E0320 10:58:45.443556 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:45.943536442 +0000 UTC m=+212.034502927 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.449495 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f7txt" event={"ID":"ddddd02c-4970-4017-a493-1f9eb50214f3","Type":"ContainerStarted","Data":"22efd482272786088938d270374e4623b4b1a2e1b7a3904bf61b442d726b9815"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.453152 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-c8tzc" podStartSLOduration=167.453130112 podStartE2EDuration="2m47.453130112s" podCreationTimestamp="2026-03-20 10:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.452190915 +0000 UTC m=+211.543157400" watchObservedRunningTime="2026-03-20 10:58:45.453130112 +0000 UTC m=+211.544096597" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.455706 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2s9dj" event={"ID":"c42f4b6b-59e0-41f7-a6de-494a458b064b","Type":"ContainerStarted","Data":"9ef9a0820da6ff91f4b4f807eb66988fda01c9f62c1520a5e62d10e70a2ce99a"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.480281 4772 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-c2hdq container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" start-of-body= Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.480349 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" podUID="3200bb8e-57ed-4de6-9d9e-fcda35a6bdc5" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.31:8443/healthz\": dial tcp 10.217.0.31:8443: connect: connection refused" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.481994 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" event={"ID":"ca311ab1-1d0d-4a92-9a6d-f14f1e269333","Type":"ContainerStarted","Data":"ba42b225c1ea46aca0fa2a327ce1a5960b03d1e7bf76e417363b509f3d59d9d0"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.501211 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" podStartSLOduration=167.501186492 podStartE2EDuration="2m47.501186492s" podCreationTimestamp="2026-03-20 10:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.492609641 +0000 UTC m=+211.583576126" watchObservedRunningTime="2026-03-20 10:58:45.501186492 +0000 UTC m=+211.592152977" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.508072 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-authentication/oauth-openshift-558db77b4-rq497" event={"ID":"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5","Type":"ContainerStarted","Data":"2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.509334 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.516914 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" event={"ID":"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3","Type":"ContainerStarted","Data":"49985016d2b2c076bea7c51c747231f67f5ec2b3fa9d53f797a1a25556593f54"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.520268 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.536427 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-4vpft" podStartSLOduration=168.536413452 podStartE2EDuration="2m48.536413452s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.535435884 +0000 UTC m=+211.626402369" watchObservedRunningTime="2026-03-20 10:58:45.536413452 +0000 UTC m=+211.627379937" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.539451 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:45 crc kubenswrapper[4772]: E0320 10:58:45.540981 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.040963659 +0000 UTC m=+212.131930154 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.552989 4772 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-rq497 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" start-of-body= Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.553497 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" podUID="0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.7:6443/healthz\": dial tcp 10.217.0.7:6443: connect: connection refused" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.558161 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-2s9dj" podStartSLOduration=6.558145452 podStartE2EDuration="6.558145452s" podCreationTimestamp="2026-03-20 10:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.555009334 +0000 UTC m=+211.645975819" watchObservedRunningTime="2026-03-20 10:58:45.558145452 +0000 UTC m=+211.649111967" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.564145 4772 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-zlsr2 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" start-of-body= Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.564398 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" podUID="f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.35:8443/healthz\": dial tcp 10.217.0.35:8443: connect: connection refused" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.583619 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" podStartSLOduration=168.583596367 podStartE2EDuration="2m48.583596367s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.578315138 +0000 UTC m=+211.669281643" watchObservedRunningTime="2026-03-20 10:58:45.583596367 +0000 UTC m=+211.674562852" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.590915 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" event={"ID":"34a9a96c-7d9a-412b-ac98-76747d89f7ba","Type":"ContainerStarted","Data":"f4efc4685fd2d71fd69b27949f6daa3d4e374ba5c0cfcec40d2c57c5b8893fef"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.590996 4772 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" event={"ID":"34a9a96c-7d9a-412b-ac98-76747d89f7ba","Type":"ContainerStarted","Data":"e750318d4d17c0e2499c68e96c9435d5a2f46229c89e279833c772e084ed5536"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.617240 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.643260 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" podStartSLOduration=167.643240333 podStartE2EDuration="2m47.643240333s" podCreationTimestamp="2026-03-20 10:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.641420402 +0000 UTC m=+211.732386897" watchObservedRunningTime="2026-03-20 10:58:45.643240333 +0000 UTC m=+211.734206818" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.643498 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" podStartSLOduration=168.64349203 podStartE2EDuration="2m48.64349203s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.617440718 +0000 UTC m=+211.708407203" watchObservedRunningTime="2026-03-20 10:58:45.64349203 +0000 UTC m=+211.734458515" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.645990 4772 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-dmbpv container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" start-of-body= Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.646061 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" podUID="34a9a96c-7d9a-412b-ac98-76747d89f7ba" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.25:8443/healthz\": dial tcp 10.217.0.25:8443: connect: connection refused" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.647276 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:45 crc kubenswrapper[4772]: E0320 10:58:45.647642 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.147624276 +0000 UTC m=+212.238590761 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.688218 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zlklv" event={"ID":"65a4877c-95f9-4a93-9c74-ecde8d9a7b95","Type":"ContainerStarted","Data":"7b5e1cfed84ae84a805547853959c9ff82d8e486ffd42c52b532fe7ebc674858"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.689330 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.714309 4772 patch_prober.go:28] interesting pod/console-operator-58897d9998-zlklv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.714378 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zlklv" podUID="65a4877c-95f9-4a93-9c74-ecde8d9a7b95" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.715181 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-58897d9998-zlklv" podStartSLOduration=168.715167713 podStartE2EDuration="2m48.715167713s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.714442433 +0000 UTC m=+211.805408918" watchObservedRunningTime="2026-03-20 10:58:45.715167713 +0000 UTC m=+211.806134198" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.731872 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" event={"ID":"77c87234-b79b-4d2f-8ee3-b14aa050925a","Type":"ContainerStarted","Data":"6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.732912 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.745170 4772 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6fhz7 container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.745217 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" podUID="77c87234-b79b-4d2f-8ee3-b14aa050925a" containerName="marketplace-operator" probeResult="failure" output="Get 
\"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.750231 4772 generic.go:334] "Generic (PLEG): container finished" podID="13ebbb5a-3355-4661-92f7-651afafe19e1" containerID="9c5af0315e1d918981fb3ee92822e962363c7497af2d125bd95269c67a1491a0" exitCode=0 Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.750318 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" event={"ID":"13ebbb5a-3355-4661-92f7-651afafe19e1","Type":"ContainerDied","Data":"9c5af0315e1d918981fb3ee92822e962363c7497af2d125bd95269c67a1491a0"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.754470 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:45 crc kubenswrapper[4772]: E0320 10:58:45.755971 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.255955759 +0000 UTC m=+212.346922244 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.761611 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-zlqcq" event={"ID":"9309f110-5a80-46ca-b3de-8087048c13e2","Type":"ContainerStarted","Data":"2d223b9fe83b8af1dcec02c6a9a74e9e64b68b3cffcd9f9698bc74fe2534c11d"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.779548 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" podStartSLOduration=168.779531061 podStartE2EDuration="2m48.779531061s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.753096609 +0000 UTC m=+211.844063094" watchObservedRunningTime="2026-03-20 10:58:45.779531061 +0000 UTC m=+211.870497546" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.803692 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" event={"ID":"8f920728-8d6a-43d6-989a-4ee1665c76ab","Type":"ContainerStarted","Data":"36395ea6bec5ca69b6209c61d9f1d51a9be7a236d210c71e422703cbe7f5f606"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.806762 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.814095 4772 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zcpz4 container/packageserver 
namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" start-of-body= Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.814143 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" podUID="8f920728-8d6a-43d6-989a-4ee1665c76ab" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": dial tcp 10.217.0.38:5443: connect: connection refused" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.826560 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" event={"ID":"45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6","Type":"ContainerStarted","Data":"14b71d75fec0f142fc6d5ac68d610f0924202f57a05308c73f06ea1b42b33776"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.827887 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" podStartSLOduration=167.82785978 podStartE2EDuration="2m47.82785978s" podCreationTimestamp="2026-03-20 10:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.826369067 +0000 UTC m=+211.917335542" watchObservedRunningTime="2026-03-20 10:58:45.82785978 +0000 UTC m=+211.918826275" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.841771 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-p564j" event={"ID":"2563e1e5-aeb2-4d41-857f-171e91d41281","Type":"ContainerStarted","Data":"4423a02944ac4a68fa417e7ec070e57812945bff8f7ca3001a0c1dc33c5abc3d"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.849032 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-lvstj" podStartSLOduration=168.849014664 podStartE2EDuration="2m48.849014664s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.848925441 +0000 UTC m=+211.939891946" watchObservedRunningTime="2026-03-20 10:58:45.849014664 +0000 UTC m=+211.939981149" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.853569 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" event={"ID":"2d8487f2-85df-4de3-a487-34f79c15ef8a","Type":"ContainerStarted","Data":"8298842a8674ff3ea624f4d0f25dfb1e2a155e8636cee456dd98acdb3515097e"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.855739 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:45 crc kubenswrapper[4772]: E0320 10:58:45.858124 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:46.356507974 +0000 UTC m=+212.447474569 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.868877 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" event={"ID":"709e108a-3d2e-473b-bff3-28cb1269f598","Type":"ContainerStarted","Data":"83803d125c37c7021c0d75fa3b4cf0b69552bca2b9bcc85e5c008f6a093fe013"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.884199 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-p564j" podStartSLOduration=167.884183412 podStartE2EDuration="2m47.884183412s" podCreationTimestamp="2026-03-20 10:55:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.882793443 +0000 UTC m=+211.973759928" watchObservedRunningTime="2026-03-20 10:58:45.884183412 +0000 UTC m=+211.975149897" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.888040 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" event={"ID":"a5752755-110f-47d1-b9cf-ff3e35aabf8f","Type":"ContainerStarted","Data":"3167b40efd733c90a2b11866bd2b405efbc55d4614b0860d8ebefa13f0a83c40"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.901577 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm" event={"ID":"2ce1b289-5291-4da2-afbc-0b5320e730b9","Type":"ContainerStarted","Data":"46a9bd0953d90789f9f86a04fc561c92704ee9e0ab4b9d6c49375e847d17fd5b"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.937017 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-j6hfm" podStartSLOduration=168.936997386 podStartE2EDuration="2m48.936997386s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.936360967 +0000 UTC m=+212.027327462" watchObservedRunningTime="2026-03-20 10:58:45.936997386 +0000 UTC m=+212.027963871" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.937124 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" podStartSLOduration=168.937119349 podStartE2EDuration="2m48.937119349s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.901740055 +0000 UTC m=+211.992706530" watchObservedRunningTime="2026-03-20 10:58:45.937119349 +0000 UTC m=+212.028085834" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.939413 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854" event={"ID":"9f4d9edb-87ce-41e1-9cc0-aaf07230ec92","Type":"ContainerStarted","Data":"bb3da171ef89962818c11c04f80efbb126d9d9165fa02cb5928e6ef87518ff44"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.939451 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854" event={"ID":"9f4d9edb-87ce-41e1-9cc0-aaf07230ec92","Type":"ContainerStarted","Data":"8d6c2317efcfd744b915a241cb4ed0f641d5b705f4d9dc6a918465e230ee1257"} Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.939832 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2z9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.939876 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2z9" podUID="6872c5d1-0892-4abc-9c68-5fe459ed1107" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.957150 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:45 crc kubenswrapper[4772]: E0320 10:58:45.961876 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.461827233 +0000 UTC m=+212.552793718 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:45 crc kubenswrapper[4772]: I0320 10:58:45.976285 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-m6mzp" podStartSLOduration=168.976264499 podStartE2EDuration="2m48.976264499s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:45.970472045 +0000 UTC m=+212.061438540" watchObservedRunningTime="2026-03-20 10:58:45.976264499 +0000 UTC m=+212.067230984" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.060673 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.060992 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.560980779 +0000 UTC m=+212.651947264 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.114393 4772 patch_prober.go:28] interesting pod/router-default-5444994796-9drnj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:46 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Mar 20 10:58:46 crc kubenswrapper[4772]: [+]process-running ok Mar 20 10:58:46 crc kubenswrapper[4772]: healthz check failed Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.114454 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9drnj" podUID="81a4f4cd-feb2-4c87-99f7-04202818012f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.164408 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.164634 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.66462102 +0000 UTC m=+212.755587505 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.265984 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.266370 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.766339017 +0000 UTC m=+212.857305512 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.366875 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.367368 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.867316404 +0000 UTC m=+212.958282909 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.367466 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.367933 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.867916501 +0000 UTC m=+212.958882996 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.377117 4772 ???:1] "http: TLS handshake error from 192.168.126.11:50398: no serving certificate available for the kubelet" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.468071 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.468348 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:46.968333892 +0000 UTC m=+213.059300377 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.569550 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.570006 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:47.069986077 +0000 UTC m=+213.160952562 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.681043 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.681254 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:47.181208782 +0000 UTC m=+213.272175257 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.682077 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.682732 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:47.182713545 +0000 UTC m=+213.273680030 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.783663 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.783812 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:47.283789625 +0000 UTC m=+213.374756110 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.783937 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.783985 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.784015 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.784068 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.784320 4772 
nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:47.284312499 +0000 UTC m=+213.375278984 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.785371 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.789743 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.803054 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.885215 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.885486 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:46 crc kubenswrapper[4772]: E0320 10:58:46.886185 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:47.38616817 +0000 UTC m=+213.477134655 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.893872 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.900308 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.916095 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.949428 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" event={"ID":"2cec3fd7-c8be-4ea2-b196-6262ab488fac","Type":"ContainerStarted","Data":"9acdecc8b43c162b3fffc1b38db9b5458fae8886b9487d3bbb0e940566812fc3"} Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.956195 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" event={"ID":"709e108a-3d2e-473b-bff3-28cb1269f598","Type":"ContainerStarted","Data":"40f9577f6a740f34597d11fefaf540e8c81598f4eab37f3bc1a0e97e557dc2d9"} Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.956237 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-wk6fq" event={"ID":"709e108a-3d2e-473b-bff3-28cb1269f598","Type":"ContainerStarted","Data":"19f8274b8d9b0a4b8c465bd88d31965f157e1b5ebadd9c814a91f1135a89ca7b"} Mar 20 10:58:46 crc kubenswrapper[4772]: I0320 10:58:46.959920 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-2s9dj" event={"ID":"c42f4b6b-59e0-41f7-a6de-494a458b064b","Type":"ContainerStarted","Data":"d8b4a0a657b5e749ed5765673713a51297097ec414bb161f7247a20d57c0434f"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.002658 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:47 crc kubenswrapper[4772]: E0320 10:58:47.003725 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:47.503709192 +0000 UTC m=+213.594675667 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.020710 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm" event={"ID":"2ce1b289-5291-4da2-afbc-0b5320e730b9","Type":"ContainerStarted","Data":"edaade755bf4120d835d604b114345e57037ca20c65bc3f108ed8e4b8e324f0c"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.020751 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm" event={"ID":"2ce1b289-5291-4da2-afbc-0b5320e730b9","Type":"ContainerStarted","Data":"1be72d3f5405f1ff5d39622a6ad324b9302cb3a0eda6e38e7c678d0eed4ca851"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.031081 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-8l854" podStartSLOduration=170.031067921 podStartE2EDuration="2m50.031067921s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:46.011102107 +0000 UTC m=+212.102068592" watchObservedRunningTime="2026-03-20 10:58:47.031067921 +0000 UTC m=+213.122034406" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.033007 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-ctkwn" podStartSLOduration=170.033002176 podStartE2EDuration="2m50.033002176s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:47.030558377 +0000 UTC m=+213.121524862" watchObservedRunningTime="2026-03-20 10:58:47.033002176 +0000 UTC m=+213.123968661" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.074187 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" event={"ID":"13ebbb5a-3355-4661-92f7-651afafe19e1","Type":"ContainerStarted","Data":"fe04b46c5671dbe54cc36a4a3247d3f17ca1845557fc8e56f8060d7a82724cb8"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.074226 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" event={"ID":"13ebbb5a-3355-4661-92f7-651afafe19e1","Type":"ContainerStarted","Data":"3c8f08d3c797e4b6c2413045addec1486c87afaccb313cb0466f133f2ac45073"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.105017 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" event={"ID":"ef57ed62-2f1f-4411-974a-f4cf6c624e25","Type":"ContainerStarted","Data":"ccff0e03034ba4efdf7fa6b93e48a72a01e582d3d509fc90ce8d45682784001e"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.105059 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" 
event={"ID":"ef57ed62-2f1f-4411-974a-f4cf6c624e25","Type":"ContainerStarted","Data":"11e412ed555b3ac2d8df5d0d33bc0778f8f47059708135a9ff0774e38761e746"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.106691 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:47 crc kubenswrapper[4772]: E0320 10:58:47.108648 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:47.60863004 +0000 UTC m=+213.699596585 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.122995 4772 patch_prober.go:28] interesting pod/router-default-5444994796-9drnj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:47 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Mar 20 10:58:47 crc kubenswrapper[4772]: [+]process-running ok Mar 20 10:58:47 crc kubenswrapper[4772]: healthz check failed Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.123049 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9drnj" podUID="81a4f4cd-feb2-4c87-99f7-04202818012f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.124088 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-mz2wm" podStartSLOduration=170.124060913 podStartE2EDuration="2m50.124060913s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:47.065056236 +0000 UTC m=+213.156022721" watchObservedRunningTime="2026-03-20 10:58:47.124060913 +0000 UTC m=+213.215027388" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.130813 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" podStartSLOduration=170.130798853 podStartE2EDuration="2m50.130798853s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:47.123481677 +0000 UTC m=+213.214448152" watchObservedRunningTime="2026-03-20 10:58:47.130798853 +0000 UTC m=+213.221765338" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.153558 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" event={"ID":"bfa70bc3-d525-49db-94fd-316370428815","Type":"ContainerStarted","Data":"1a8f18fccfe0fee5edcb4d43bf00d3c9f97f51d88c60d21cebf293816532a484"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.153716 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-m22rd" podStartSLOduration=170.153682786 podStartE2EDuration="2m50.153682786s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:47.152602955 +0000 UTC m=+213.243569440" watchObservedRunningTime="2026-03-20 10:58:47.153682786 +0000 UTC m=+213.244649271" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.185099 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.193953 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-zlklv" event={"ID":"65a4877c-95f9-4a93-9c74-ecde8d9a7b95","Type":"ContainerStarted","Data":"51d5e39d0a30bbc78f9fc04c8f165ff6a182094800d4102763c219e7ee4ad28d"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.194980 4772 patch_prober.go:28] interesting pod/console-operator-58897d9998-zlklv container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.195013 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-zlklv" podUID="65a4877c-95f9-4a93-9c74-ecde8d9a7b95" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.16:8443/readyz\": dial tcp 10.217.0.16:8443: connect: connection refused" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.209639 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:47 crc kubenswrapper[4772]: E0320 10:58:47.209929 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:47.709918386 +0000 UTC m=+213.800884871 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.211938 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-5nnrh" event={"ID":"24b9f616-8834-4b47-8b4a-1d25d5efb4f2","Type":"ContainerStarted","Data":"6af90ddd47f85f8ac5dc8b147db96b201ab5e5b3c74f788283c0afa0ea54f10f"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.238157 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-n69m8" podStartSLOduration=170.238140288 podStartE2EDuration="2m50.238140288s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:47.229512846 +0000 UTC m=+213.320479331" watchObservedRunningTime="2026-03-20 10:58:47.238140288 +0000 UTC m=+213.329106763" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.249332 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-7xktg" event={"ID":"0d96629a-14b8-4f25-a58e-65e5eaf8b141","Type":"ContainerStarted","Data":"6d882fa3eb45f8ff05f55e2f1e378f24efcc4f130c91457057b6d4e6bb036917"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.250151 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-7xktg" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.256640 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" event={"ID":"bf2c75a2-ca6a-415f-80ea-830f55899119","Type":"ContainerStarted","Data":"0b987e8e085208901c8524a3b4e78820d2cfdba4e02711754e00935cb11176ec"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.280549 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" event={"ID":"8f920728-8d6a-43d6-989a-4ee1665c76ab","Type":"ContainerStarted","Data":"ec9995ca705d48d0bfab12afde4c11ad428f46e99367b15c05e6ce1cb9f89f24"} Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.280718 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" podUID="1ede39ac-a466-4925-8a5a-1dd6679b1915" containerName="route-controller-manager" containerID="cri-o://82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd" gracePeriod=30 Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.286355 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" podUID="f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" containerName="controller-manager" containerID="cri-o://49985016d2b2c076bea7c51c747231f67f5ec2b3fa9d53f797a1a25556593f54" gracePeriod=30 Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.289364 4772 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-6fhz7 
container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" start-of-body= Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.289407 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" podUID="77c87234-b79b-4d2f-8ee3-b14aa050925a" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.20:8080/healthz\": dial tcp 10.217.0.20:8080: connect: connection refused" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.304194 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-dmbpv" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.304245 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.317315 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:47 crc kubenswrapper[4772]: E0320 10:58:47.318647 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:47.8186318 +0000 UTC m=+213.909598285 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.321428 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-c2hdq" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.359026 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-7xktg" podStartSLOduration=8.358997933 podStartE2EDuration="8.358997933s" podCreationTimestamp="2026-03-20 10:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:47.294167912 +0000 UTC m=+213.385134417" watchObservedRunningTime="2026-03-20 10:58:47.358997933 +0000 UTC m=+213.449964418" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.369157 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.402162 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" podStartSLOduration=170.402144826 podStartE2EDuration="2m50.402144826s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:47.358290624 +0000 UTC m=+213.449257119" watchObservedRunningTime="2026-03-20 10:58:47.402144826 +0000 UTC m=+213.493111311" Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.419340 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:47 crc kubenswrapper[4772]: E0320 10:58:47.419652 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:47.919639747 +0000 UTC m=+214.010606232 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.520394 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:47 crc kubenswrapper[4772]: E0320 10:58:47.520948 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:48.020932483 +0000 UTC m=+214.111898968 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.621769 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:47 crc kubenswrapper[4772]: E0320 10:58:47.622478 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:48.122465235 +0000 UTC m=+214.213431720 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.730477 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:47 crc kubenswrapper[4772]: E0320 10:58:47.730933 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:48.230918462 +0000 UTC m=+214.321884947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:47 crc kubenswrapper[4772]: W0320 10:58:47.775210 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d751cbb_f2e2_430d_9754_c882a5e924a5.slice/crio-937b0fbe3d6d3fa3004ca5507a559dbd5e40eccdf09b4633b1c92eb87e721397 WatchSource:0}: Error finding container 937b0fbe3d6d3fa3004ca5507a559dbd5e40eccdf09b4633b1c92eb87e721397: Status 404 returned error can't find the container with id 937b0fbe3d6d3fa3004ca5507a559dbd5e40eccdf09b4633b1c92eb87e721397 Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.831635 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:47 crc kubenswrapper[4772]: E0320 10:58:47.832277 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:48.332259979 +0000 UTC m=+214.423226464 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:47 crc kubenswrapper[4772]: I0320 10:58:47.933568 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:47 crc kubenswrapper[4772]: E0320 10:58:47.933870 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:48.433854673 +0000 UTC m=+214.524821158 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:48 crc kubenswrapper[4772]: W0320 10:58:48.019919 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b6479f0_333b_4a96_9adf_2099afdc2447.slice/crio-85576f7cc884310840995cd168582f0bcf7f282cb36af731a3f229776e390245 WatchSource:0}: Error finding container 85576f7cc884310840995cd168582f0bcf7f282cb36af731a3f229776e390245: Status 404 returned error can't find the container with id 85576f7cc884310840995cd168582f0bcf7f282cb36af731a3f229776e390245 Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.036050 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.036435 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:48.536424245 +0000 UTC m=+214.627390730 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.109927 4772 patch_prober.go:28] interesting pod/router-default-5444994796-9drnj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:48 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Mar 20 10:58:48 crc kubenswrapper[4772]: [+]process-running ok Mar 20 10:58:48 crc kubenswrapper[4772]: healthz check failed Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.109986 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9drnj" podUID="81a4f4cd-feb2-4c87-99f7-04202818012f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.124435 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.137222 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.137331 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:48.637306729 +0000 UTC m=+214.728273204 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.137505 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.137814 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:48.637806602 +0000 UTC m=+214.728773087 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.164285 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq"] Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.167744 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1ede39ac-a466-4925-8a5a-1dd6679b1915" containerName="route-controller-manager" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.167777 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1ede39ac-a466-4925-8a5a-1dd6679b1915" containerName="route-controller-manager" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.168250 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1ede39ac-a466-4925-8a5a-1dd6679b1915" containerName="route-controller-manager" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.168857 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.191171 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq"] Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.238012 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-config\") pod \"1ede39ac-a466-4925-8a5a-1dd6679b1915\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.239135 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-config" (OuterVolumeSpecName: "config") pod "1ede39ac-a466-4925-8a5a-1dd6679b1915" (UID: "1ede39ac-a466-4925-8a5a-1dd6679b1915"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.239240 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ede39ac-a466-4925-8a5a-1dd6679b1915-serving-cert\") pod \"1ede39ac-a466-4925-8a5a-1dd6679b1915\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.239271 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-client-ca\") pod \"1ede39ac-a466-4925-8a5a-1dd6679b1915\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.239331 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vpxsf\" (UniqueName: \"kubernetes.io/projected/1ede39ac-a466-4925-8a5a-1dd6679b1915-kube-api-access-vpxsf\") pod \"1ede39ac-a466-4925-8a5a-1dd6679b1915\" (UID: \"1ede39ac-a466-4925-8a5a-1dd6679b1915\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.239437 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.239624 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b9brg\" (UniqueName: \"kubernetes.io/projected/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-kube-api-access-b9brg\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.239646 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-serving-cert\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.239671 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-config\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.239694 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-client-ca\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.239733 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-config\") on node 
\"crc\" DevicePath \"\"" Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.240854 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:48.740824507 +0000 UTC m=+214.831790992 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.243046 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-client-ca" (OuterVolumeSpecName: "client-ca") pod "1ede39ac-a466-4925-8a5a-1dd6679b1915" (UID: "1ede39ac-a466-4925-8a5a-1dd6679b1915"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.247373 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1ede39ac-a466-4925-8a5a-1dd6679b1915-kube-api-access-vpxsf" (OuterVolumeSpecName: "kube-api-access-vpxsf") pod "1ede39ac-a466-4925-8a5a-1dd6679b1915" (UID: "1ede39ac-a466-4925-8a5a-1dd6679b1915"). InnerVolumeSpecName "kube-api-access-vpxsf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.247025 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1ede39ac-a466-4925-8a5a-1dd6679b1915-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1ede39ac-a466-4925-8a5a-1dd6679b1915" (UID: "1ede39ac-a466-4925-8a5a-1dd6679b1915"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.280464 4772 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-zcpz4 container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.280986 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" podUID="8f920728-8d6a-43d6-989a-4ee1665c76ab" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.38:5443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.316061 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f7txt" event={"ID":"ddddd02c-4970-4017-a493-1f9eb50214f3","Type":"ContainerStarted","Data":"3fb99a27e978fd5d7f29c43f83947f38969a7a3bc247007dd75dc2db01e8d814"} Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.341171 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-client-ca\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.341271 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.341293 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b9brg\" (UniqueName: \"kubernetes.io/projected/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-kube-api-access-b9brg\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.341310 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-serving-cert\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.341331 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-config\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.341371 4772 reconciler_common.go:293] "Volume detached for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/1ede39ac-a466-4925-8a5a-1dd6679b1915-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.341381 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1ede39ac-a466-4925-8a5a-1dd6679b1915-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.341390 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vpxsf\" (UniqueName: \"kubernetes.io/projected/1ede39ac-a466-4925-8a5a-1dd6679b1915-kube-api-access-vpxsf\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.342567 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-config\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.343199 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-client-ca\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.343480 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:48.843462461 +0000 UTC m=+214.934428946 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.344704 4772 generic.go:334] "Generic (PLEG): container finished" podID="1ede39ac-a466-4925-8a5a-1dd6679b1915" containerID="82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd" exitCode=0 Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.344782 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" event={"ID":"1ede39ac-a466-4925-8a5a-1dd6679b1915","Type":"ContainerDied","Data":"82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd"} Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.344803 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" event={"ID":"1ede39ac-a466-4925-8a5a-1dd6679b1915","Type":"ContainerDied","Data":"a9d2f1d8482e6ba0c77c8f1fb9edb5b9aa5ce4be38802e1ff1741f2c811d27e1"} Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.344819 4772 scope.go:117] "RemoveContainer" containerID="82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.344833 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.349764 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-serving-cert\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.366400 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"85576f7cc884310840995cd168582f0bcf7f282cb36af731a3f229776e390245"} Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.371759 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b9brg\" (UniqueName: \"kubernetes.io/projected/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-kube-api-access-b9brg\") pod \"route-controller-manager-6446bb59c7-xphgq\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.378101 4772 generic.go:334] "Generic (PLEG): container finished" podID="f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" containerID="49985016d2b2c076bea7c51c747231f67f5ec2b3fa9d53f797a1a25556593f54" exitCode=0 Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.378173 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" 
event={"ID":"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3","Type":"ContainerDied","Data":"49985016d2b2c076bea7c51c747231f67f5ec2b3fa9d53f797a1a25556593f54"} Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.382886 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"c190e899e2d0bcfd4b6ac581636a673dde61a255b8bab41090ca086f06571d66"} Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.382910 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"fb8909ac2ab5e14555d018009dcf267a8cfbdf641f55919097ae98302c7df2c9"} Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.411318 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"937b0fbe3d6d3fa3004ca5507a559dbd5e40eccdf09b4633b1c92eb87e721397"} Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.420372 4772 scope.go:117] "RemoveContainer" containerID="82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd" Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.421328 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd\": container with ID starting with 82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd not found: ID does not exist" containerID="82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.421353 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd"} err="failed to get container status \"82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd\": rpc error: code = NotFound desc = could not find container \"82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd\": container with ID starting with 82d5dda6c3660951d194a0a5a9bafdc6558793738805384f9b50180eb0518cfd not found: ID does not exist" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.426190 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.430082 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-zlklv" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.442259 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.444072 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-03-20 10:58:48.944055466 +0000 UTC m=+215.035021951 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.505118 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.530590 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc"] Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.544955 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-schhc"] Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.546882 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.547190 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:49.047179603 +0000 UTC m=+215.138146088 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.572013 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.572575 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.581800 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.581798 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.582016 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.605116 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-zcpz4" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.656203 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.657067 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/133225e8-b536-4144-9630-51c10d85f663-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"133225e8-b536-4144-9630-51c10d85f663\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.657166 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/133225e8-b536-4144-9630-51c10d85f663-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"133225e8-b536-4144-9630-51c10d85f663\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.663392 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.663539 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:49.163517261 +0000 UTC m=+215.254483746 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.704745 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1ede39ac-a466-4925-8a5a-1dd6679b1915" path="/var/lib/kubelet/pods/1ede39ac-a466-4925-8a5a-1dd6679b1915/volumes" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.729706 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-pt8p5"] Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.730021 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" containerName="controller-manager" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.730039 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" containerName="controller-manager" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.730154 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" containerName="controller-manager" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.731138 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.732814 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.736145 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pt8p5"] Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.760046 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sjfrb\" (UniqueName: \"kubernetes.io/projected/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-kube-api-access-sjfrb\") pod \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.760091 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-client-ca\") pod \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.760147 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-serving-cert\") pod \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.760177 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-config\") pod \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.760199 4772 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-proxy-ca-bundles\") pod \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\" (UID: \"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.760528 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/133225e8-b536-4144-9630-51c10d85f663-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"133225e8-b536-4144-9630-51c10d85f663\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.760555 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6cg2\" (UniqueName: \"kubernetes.io/projected/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-kube-api-access-x6cg2\") pod \"certified-operators-pt8p5\" (UID: \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.760584 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.760614 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-catalog-content\") pod \"certified-operators-pt8p5\" (UID: \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.760640 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/133225e8-b536-4144-9630-51c10d85f663-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"133225e8-b536-4144-9630-51c10d85f663\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.760674 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-utilities\") pod \"certified-operators-pt8p5\" (UID: \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.761037 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/133225e8-b536-4144-9630-51c10d85f663-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"133225e8-b536-4144-9630-51c10d85f663\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.761571 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-03-20 10:58:49.261539215 +0000 UTC m=+215.352505900 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-xd664" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.761657 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-client-ca" (OuterVolumeSpecName: "client-ca") pod "f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" (UID: "f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.762348 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-config" (OuterVolumeSpecName: "config") pod "f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" (UID: "f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.766868 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" (UID: "f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.768747 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" (UID: "f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.776937 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-kube-api-access-sjfrb" (OuterVolumeSpecName: "kube-api-access-sjfrb") pod "f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" (UID: "f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3"). InnerVolumeSpecName "kube-api-access-sjfrb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.786783 4772 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.789416 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/133225e8-b536-4144-9630-51c10d85f663-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"133225e8-b536-4144-9630-51c10d85f663\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.853392 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-8kc55"] Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.857327 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.860452 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.863356 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.863815 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-catalog-content\") pod \"certified-operators-pt8p5\" (UID: \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.863886 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-utilities\") pod \"certified-operators-pt8p5\" (UID: \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.863932 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x6cg2\" (UniqueName: \"kubernetes.io/projected/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-kube-api-access-x6cg2\") pod \"certified-operators-pt8p5\" (UID: \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.863981 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.863992 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.864002 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: 
\"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.864013 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sjfrb\" (UniqueName: \"kubernetes.io/projected/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-kube-api-access-sjfrb\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.864023 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:48 crc kubenswrapper[4772]: E0320 10:58:48.864263 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-03-20 10:58:49.364248649 +0000 UTC m=+215.455215134 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.864618 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-catalog-content\") pod \"certified-operators-pt8p5\" (UID: \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.864833 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-utilities\") pod \"certified-operators-pt8p5\" (UID: \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.868681 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8kc55"] Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.878034 4772 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-03-20T10:58:48.786821575Z","Handler":null,"Name":""} Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.884959 4772 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.885000 4772 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.887960 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6cg2\" (UniqueName: \"kubernetes.io/projected/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-kube-api-access-x6cg2\") pod \"certified-operators-pt8p5\" (UID: 
\"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.896505 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.980472 4772 ???:1] "http: TLS handshake error from 192.168.126.11:50406: no serving certificate available for the kubelet" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.981555 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-catalog-content\") pod \"community-operators-8kc55\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.981600 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.981620 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr9dx\" (UniqueName: \"kubernetes.io/projected/22e23182-8e10-42d5-b34d-f09f6f280262-kube-api-access-fr9dx\") pod \"community-operators-8kc55\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.986733 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-utilities\") pod \"community-operators-8kc55\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.993724 4772 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Mar 20 10:58:48 crc kubenswrapper[4772]: I0320 10:58:48.993775 4772 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.024269 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-xd664\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.076994 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-f9wnx"] Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.079405 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.080056 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9wnx"] Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.094989 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.095358 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.096888 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-utilities\") pod \"community-operators-8kc55\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.097018 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-catalog-content\") pod \"community-operators-8kc55\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.097057 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fr9dx\" (UniqueName: \"kubernetes.io/projected/22e23182-8e10-42d5-b34d-f09f6f280262-kube-api-access-fr9dx\") pod \"community-operators-8kc55\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.099364 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-utilities\") pod \"community-operators-8kc55\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.099736 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-catalog-content\") pod \"community-operators-8kc55\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.109166 4772 patch_prober.go:28] interesting pod/router-default-5444994796-9drnj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:49 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Mar 20 10:58:49 crc kubenswrapper[4772]: [+]process-running ok Mar 20 10:58:49 crc kubenswrapper[4772]: healthz check failed Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.109241 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9drnj" podUID="81a4f4cd-feb2-4c87-99f7-04202818012f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.119509 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.127228 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fr9dx\" (UniqueName: \"kubernetes.io/projected/22e23182-8e10-42d5-b34d-f09f6f280262-kube-api-access-fr9dx\") pod \"community-operators-8kc55\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.159512 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Mar 20 10:58:49 crc kubenswrapper[4772]: W0320 10:58:49.185723 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod133225e8_b536_4144_9630_51c10d85f663.slice/crio-3876d7c0eb0c78c701fd13c0eac7d159f759eabaf0f9af112a0fdd7f03abf097 WatchSource:0}: Error finding container 3876d7c0eb0c78c701fd13c0eac7d159f759eabaf0f9af112a0fdd7f03abf097: Status 404 returned error can't find the container with id 3876d7c0eb0c78c701fd13c0eac7d159f759eabaf0f9af112a0fdd7f03abf097 Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.189158 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.199891 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-catalog-content\") pod \"certified-operators-f9wnx\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.199983 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gf9mw\" (UniqueName: \"kubernetes.io/projected/b1148455-d28e-4866-8b3e-cbabeaad84c7-kube-api-access-gf9mw\") pod \"certified-operators-f9wnx\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.200063 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-utilities\") pod \"certified-operators-f9wnx\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.203596 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq"] Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.213087 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.256984 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7mp8g"] Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.257999 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:58:49 crc kubenswrapper[4772]: W0320 10:58:49.262048 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf95f3b03_0f29_4683_b1eb_11ebc4d66b3f.slice/crio-5ac7a556e5620787412e050b275df1204ae61702c039b052461057f3edb6a5df WatchSource:0}: Error finding container 5ac7a556e5620787412e050b275df1204ae61702c039b052461057f3edb6a5df: Status 404 returned error can't find the container with id 5ac7a556e5620787412e050b275df1204ae61702c039b052461057f3edb6a5df Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.274670 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7mp8g"] Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.301721 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-utilities\") pod \"certified-operators-f9wnx\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.301763 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-catalog-content\") pod \"certified-operators-f9wnx\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.301813 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gf9mw\" (UniqueName: \"kubernetes.io/projected/b1148455-d28e-4866-8b3e-cbabeaad84c7-kube-api-access-gf9mw\") pod \"certified-operators-f9wnx\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.302573 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-catalog-content\") pod \"certified-operators-f9wnx\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.302641 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-utilities\") pod \"certified-operators-f9wnx\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.335035 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gf9mw\" (UniqueName: \"kubernetes.io/projected/b1148455-d28e-4866-8b3e-cbabeaad84c7-kube-api-access-gf9mw\") pod \"certified-operators-f9wnx\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.403554 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wnxsh\" (UniqueName: \"kubernetes.io/projected/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-kube-api-access-wnxsh\") pod \"community-operators-7mp8g\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " 
pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.403624 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-catalog-content\") pod \"community-operators-7mp8g\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.403662 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-utilities\") pod \"community-operators-7mp8g\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.411162 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.424324 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-pt8p5"] Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.440295 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"fe301dec20421807acaf0e09fb34bb3a7981d05d159d08790d1645e77d9bd0ba"} Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.440930 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.445008 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.445041 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-zlsr2" event={"ID":"f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3","Type":"ContainerDied","Data":"496c1f4fcea66366a2317b5b80cba50a4e9c9520dea045ccaed0f798745d8a6a"} Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.445093 4772 scope.go:117] "RemoveContainer" containerID="49985016d2b2c076bea7c51c747231f67f5ec2b3fa9d53f797a1a25556593f54" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.445991 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"133225e8-b536-4144-9630-51c10d85f663","Type":"ContainerStarted","Data":"3876d7c0eb0c78c701fd13c0eac7d159f759eabaf0f9af112a0fdd7f03abf097"} Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.447536 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"442d16d99c41edd31cc90431946fde83a0e94e5e1a79bbe5e1f9e69a4301e8a6"} Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.450179 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" event={"ID":"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f","Type":"ContainerStarted","Data":"5ac7a556e5620787412e050b275df1204ae61702c039b052461057f3edb6a5df"} Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.463251 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f7txt" event={"ID":"ddddd02c-4970-4017-a493-1f9eb50214f3","Type":"ContainerStarted","Data":"98fb571266eff252b35ef02a82c71d0bdcb6fa95e9f87b7047828d7710ef3648"} Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.505020 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wnxsh\" (UniqueName: \"kubernetes.io/projected/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-kube-api-access-wnxsh\") pod \"community-operators-7mp8g\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.505533 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-catalog-content\") pod \"community-operators-7mp8g\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.505577 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-utilities\") pod \"community-operators-7mp8g\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.506532 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-utilities\") pod \"community-operators-7mp8g\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:58:49 crc 
kubenswrapper[4772]: I0320 10:58:49.506606 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-catalog-content\") pod \"community-operators-7mp8g\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.527610 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wnxsh\" (UniqueName: \"kubernetes.io/projected/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-kube-api-access-wnxsh\") pod \"community-operators-7mp8g\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.663447 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zlsr2"] Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.672469 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-zlsr2"] Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.676227 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.692554 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-8kc55"] Mar 20 10:58:49 crc kubenswrapper[4772]: I0320 10:58:49.739018 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xd664"] Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.000937 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-f9wnx"] Mar 20 10:58:50 crc kubenswrapper[4772]: W0320 10:58:50.025952 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1148455_d28e_4866_8b3e_cbabeaad84c7.slice/crio-4fa33889d623a5c1e7a178dc55801293a86ec6dbe5f716335c9a02e325b3e93f WatchSource:0}: Error finding container 4fa33889d623a5c1e7a178dc55801293a86ec6dbe5f716335c9a02e325b3e93f: Status 404 returned error can't find the container with id 4fa33889d623a5c1e7a178dc55801293a86ec6dbe5f716335c9a02e325b3e93f Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.107040 4772 patch_prober.go:28] interesting pod/router-default-5444994796-9drnj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:50 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Mar 20 10:58:50 crc kubenswrapper[4772]: [+]process-running ok Mar 20 10:58:50 crc kubenswrapper[4772]: healthz check failed Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.107100 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9drnj" podUID="81a4f4cd-feb2-4c87-99f7-04202818012f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.245716 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8"] Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.246533 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.248566 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.248784 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.251216 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.251343 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.251457 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7mp8g"] Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.251672 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.252070 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.262866 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8"] Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.264961 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.316774 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jzl92\" (UniqueName: \"kubernetes.io/projected/1b52a731-93cf-4a4b-a491-b73c221bc21e-kube-api-access-jzl92\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.316885 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-client-ca\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.316907 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-config\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.316940 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-proxy-ca-bundles\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 
10:58:50.316964 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b52a731-93cf-4a4b-a491-b73c221bc21e-serving-cert\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.418527 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jzl92\" (UniqueName: \"kubernetes.io/projected/1b52a731-93cf-4a4b-a491-b73c221bc21e-kube-api-access-jzl92\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.419256 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-client-ca\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.419282 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-config\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.420402 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-client-ca\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.420505 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-proxy-ca-bundles\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.421522 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-proxy-ca-bundles\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.420537 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b52a731-93cf-4a4b-a491-b73c221bc21e-serving-cert\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.421615 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-config\") pod 
\"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.441347 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b52a731-93cf-4a4b-a491-b73c221bc21e-serving-cert\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.450681 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jzl92\" (UniqueName: \"kubernetes.io/projected/1b52a731-93cf-4a4b-a491-b73c221bc21e-kube-api-access-jzl92\") pod \"controller-manager-7d95bb6b4-mgtf8\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.487367 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-f7txt" event={"ID":"ddddd02c-4970-4017-a493-1f9eb50214f3","Type":"ContainerStarted","Data":"17f4f3d2431de8d53ccb878f524fd98a7c9f998ca742d6df76dfc3df7a3573c5"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.492063 4772 generic.go:334] "Generic (PLEG): container finished" podID="b1148455-d28e-4866-8b3e-cbabeaad84c7" containerID="49d40b1c23653dd7d7a664dc4fb33bbb4ddc94bf3d4782892e2a69fcc40fb2a2" exitCode=0 Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.492165 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9wnx" event={"ID":"b1148455-d28e-4866-8b3e-cbabeaad84c7","Type":"ContainerDied","Data":"49d40b1c23653dd7d7a664dc4fb33bbb4ddc94bf3d4782892e2a69fcc40fb2a2"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.492221 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9wnx" event={"ID":"b1148455-d28e-4866-8b3e-cbabeaad84c7","Type":"ContainerStarted","Data":"4fa33889d623a5c1e7a178dc55801293a86ec6dbe5f716335c9a02e325b3e93f"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.494635 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mp8g" event={"ID":"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe","Type":"ContainerStarted","Data":"fec2d7bddc8a1f16f63a769dc043d245a99c5c5253dd8211d4e78f396fc50b66"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.505061 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" event={"ID":"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f","Type":"ContainerStarted","Data":"8a3b11ccb8001ae517392f1a66c37b842284eb4dfbff3df9cdba744806b08c15"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.505552 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.511207 4772 generic.go:334] "Generic (PLEG): container finished" podID="22e23182-8e10-42d5-b34d-f09f6f280262" containerID="b9c0ea46943d859c363c21a4973942a4d4549ec717fee5ce381c83dba8f4cf55" exitCode=0 Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.511289 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kc55" 
event={"ID":"22e23182-8e10-42d5-b34d-f09f6f280262","Type":"ContainerDied","Data":"b9c0ea46943d859c363c21a4973942a4d4549ec717fee5ce381c83dba8f4cf55"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.511344 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kc55" event={"ID":"22e23182-8e10-42d5-b34d-f09f6f280262","Type":"ContainerStarted","Data":"1ab9a8952b9359270517ab843b7a5a3af7b0c0c5b107b6d51a61939e2bfe6115"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.513562 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.516335 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" event={"ID":"06381439-6997-45aa-8dce-62b012b0ac68","Type":"ContainerDied","Data":"e53c63cafd33fbfa4e94f437ac55a29be8c86aee585aa730bb52cab188476104"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.516305 4772 generic.go:334] "Generic (PLEG): container finished" podID="06381439-6997-45aa-8dce-62b012b0ac68" containerID="e53c63cafd33fbfa4e94f437ac55a29be8c86aee585aa730bb52cab188476104" exitCode=0 Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.516584 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-f7txt" podStartSLOduration=11.516571428 podStartE2EDuration="11.516571428s" podCreationTimestamp="2026-03-20 10:58:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:50.513632046 +0000 UTC m=+216.604598531" watchObservedRunningTime="2026-03-20 10:58:50.516571428 +0000 UTC m=+216.607537913" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.522048 4772 generic.go:334] "Generic (PLEG): container finished" podID="133225e8-b536-4144-9630-51c10d85f663" containerID="b30f84634d9b56f3756bceea40fe2428f10320a5f62ceb65a5df62788cdc947f" exitCode=0 Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.522114 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"133225e8-b536-4144-9630-51c10d85f663","Type":"ContainerDied","Data":"b30f84634d9b56f3756bceea40fe2428f10320a5f62ceb65a5df62788cdc947f"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.523437 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" event={"ID":"15ca766d-44d0-4433-b2f8-6348e66ee047","Type":"ContainerStarted","Data":"3b3b86a309e5e6ddc2d6ddf779075cb856ec9bdaef8470f2db27bbeaf080fcc3"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.523461 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" event={"ID":"15ca766d-44d0-4433-b2f8-6348e66ee047","Type":"ContainerStarted","Data":"bd3ade33c5717b92a02e0632ca0d83cfb3c1fc7498d5c0fa39f76c710696bfb0"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.524034 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.525133 4772 generic.go:334] "Generic (PLEG): container finished" podID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" 
containerID="ccecad6519639b697b142db9d0f854677d7ce54e92a81915b8ef686030ec28db" exitCode=0 Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.525800 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt8p5" event={"ID":"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260","Type":"ContainerDied","Data":"ccecad6519639b697b142db9d0f854677d7ce54e92a81915b8ef686030ec28db"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.525822 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt8p5" event={"ID":"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260","Type":"ContainerStarted","Data":"150fb99e7990748158fb20e8f50679d99cdda0612a5f03d3a19aac5027c3dfe8"} Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.591135 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.622667 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" podStartSLOduration=173.622648958 podStartE2EDuration="2m53.622648958s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:50.620687613 +0000 UTC m=+216.711654098" watchObservedRunningTime="2026-03-20 10:58:50.622648958 +0000 UTC m=+216.713615433" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.626265 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" podStartSLOduration=6.626256239 podStartE2EDuration="6.626256239s" podCreationTimestamp="2026-03-20 10:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:50.587037038 +0000 UTC m=+216.678003523" watchObservedRunningTime="2026-03-20 10:58:50.626256239 +0000 UTC m=+216.717222714" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.649035 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.649665 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3" path="/var/lib/kubelet/pods/f6f328de-7ebc-4ce0-8b6e-5902dd8cabe3/volumes" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.760082 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-config-operator/openshift-config-operator-7777fb866f-drz9m" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.837527 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8"] Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.850803 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-pr4qj"] Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.851998 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.855671 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 10:58:50 crc kubenswrapper[4772]: W0320 10:58:50.859552 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b52a731_93cf_4a4b_a491_b73c221bc21e.slice/crio-99037965bfd20fa755386370ea804f5d48209f32c85af703a4a6a65688bfbf6b WatchSource:0}: Error finding container 99037965bfd20fa755386370ea804f5d48209f32c85af703a4a6a65688bfbf6b: Status 404 returned error can't find the container with id 99037965bfd20fa755386370ea804f5d48209f32c85af703a4a6a65688bfbf6b Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.861040 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr4qj"] Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.931951 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r4v48\" (UniqueName: \"kubernetes.io/projected/4eadb8ff-b747-4293-800f-b9894eb72ee3-kube-api-access-r4v48\") pod \"redhat-marketplace-pr4qj\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.932026 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-utilities\") pod \"redhat-marketplace-pr4qj\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:58:50 crc kubenswrapper[4772]: I0320 10:58:50.932242 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-catalog-content\") pod \"redhat-marketplace-pr4qj\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.033924 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r4v48\" (UniqueName: \"kubernetes.io/projected/4eadb8ff-b747-4293-800f-b9894eb72ee3-kube-api-access-r4v48\") pod \"redhat-marketplace-pr4qj\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.033984 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-utilities\") pod \"redhat-marketplace-pr4qj\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.034025 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-catalog-content\") pod \"redhat-marketplace-pr4qj\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.034467 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-catalog-content\") pod \"redhat-marketplace-pr4qj\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.034686 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-utilities\") pod \"redhat-marketplace-pr4qj\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.053943 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r4v48\" (UniqueName: \"kubernetes.io/projected/4eadb8ff-b747-4293-800f-b9894eb72ee3-kube-api-access-r4v48\") pod \"redhat-marketplace-pr4qj\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.107008 4772 patch_prober.go:28] interesting pod/router-default-5444994796-9drnj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:51 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Mar 20 10:58:51 crc kubenswrapper[4772]: [+]process-running ok Mar 20 10:58:51 crc kubenswrapper[4772]: healthz check failed Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.107104 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9drnj" podUID="81a4f4cd-feb2-4c87-99f7-04202818012f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.178383 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.269661 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-hrd6b"] Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.271330 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.279190 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrd6b"] Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.338357 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czr5h\" (UniqueName: \"kubernetes.io/projected/04199621-c96a-4c6e-b7c0-3559112cc4fc-kube-api-access-czr5h\") pod \"redhat-marketplace-hrd6b\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.338435 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-catalog-content\") pod \"redhat-marketplace-hrd6b\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.338457 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-utilities\") pod \"redhat-marketplace-hrd6b\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.378151 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2z9 container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.378196 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-lg2z9" podUID="6872c5d1-0892-4abc-9c68-5fe459ed1107" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.378226 4772 patch_prober.go:28] interesting pod/downloads-7954f5f757-lg2z9 container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" start-of-body= Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.378271 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-lg2z9" podUID="6872c5d1-0892-4abc-9c68-5fe459ed1107" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.40:8080/\": dial tcp 10.217.0.40:8080: connect: connection refused" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.439200 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-catalog-content\") pod \"redhat-marketplace-hrd6b\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.439238 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-utilities\") pod 
\"redhat-marketplace-hrd6b\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.439318 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czr5h\" (UniqueName: \"kubernetes.io/projected/04199621-c96a-4c6e-b7c0-3559112cc4fc-kube-api-access-czr5h\") pod \"redhat-marketplace-hrd6b\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.440061 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-catalog-content\") pod \"redhat-marketplace-hrd6b\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.440274 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-utilities\") pod \"redhat-marketplace-hrd6b\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.467080 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czr5h\" (UniqueName: \"kubernetes.io/projected/04199621-c96a-4c6e-b7c0-3559112cc4fc-kube-api-access-czr5h\") pod \"redhat-marketplace-hrd6b\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.495970 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.496575 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.504065 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.506811 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.507070 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.575829 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" event={"ID":"1b52a731-93cf-4a4b-a491-b73c221bc21e","Type":"ContainerStarted","Data":"6ac0f5820b4b8a91ff4e02d2b0c429104d06a144879a97466f4c8c7653e57a1e"} Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.575902 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" event={"ID":"1b52a731-93cf-4a4b-a491-b73c221bc21e","Type":"ContainerStarted","Data":"99037965bfd20fa755386370ea804f5d48209f32c85af703a4a6a65688bfbf6b"} Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.576329 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.577764 4772 generic.go:334] "Generic (PLEG): container finished" podID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" containerID="4d3c5a2cb8492051edd44fb39984d3864e657aaf532571d686710e2c858a81d1" exitCode=0 Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.578500 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mp8g" event={"ID":"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe","Type":"ContainerDied","Data":"4d3c5a2cb8492051edd44fb39984d3864e657aaf532571d686710e2c858a81d1"} Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.596140 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.597148 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.597589 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.602476 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.616940 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr4qj"] Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.619972 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.620005 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.622752 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" podStartSLOduration=7.622722643 podStartE2EDuration="7.622722643s" podCreationTimestamp="2026-03-20 10:58:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:51.620037827 +0000 UTC m=+217.711004322" watchObservedRunningTime="2026-03-20 10:58:51.622722643 +0000 UTC m=+217.713689128" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.625144 4772 patch_prober.go:28] interesting pod/console-f9d7485db-fgwgm container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" start-of-body= Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.625202 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-fgwgm" podUID="f7c20397-4233-45e6-a7f9-5e88942e7abf" containerName="console" probeResult="failure" output="Get \"https://10.217.0.12:8443/health\": dial tcp 10.217.0.12:8443: connect: connection refused" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.626914 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.643449 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"20ef433c-e850-4a0b-9104-4e53f5c8b82d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.644223 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"20ef433c-e850-4a0b-9104-4e53f5c8b82d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.745478 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"20ef433c-e850-4a0b-9104-4e53f5c8b82d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 
10:58:51.745656 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"20ef433c-e850-4a0b-9104-4e53f5c8b82d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.746904 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"20ef433c-e850-4a0b-9104-4e53f5c8b82d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.773268 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"20ef433c-e850-4a0b-9104-4e53f5c8b82d\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.853875 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.863416 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-6l8kp"] Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.869884 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.883005 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.889379 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.889411 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.910917 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6l8kp"] Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.916248 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.956415 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-catalog-content\") pod \"redhat-operators-6l8kp\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.957471 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ggr6f\" (UniqueName: \"kubernetes.io/projected/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-kube-api-access-ggr6f\") pod \"redhat-operators-6l8kp\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:58:51 crc kubenswrapper[4772]: I0320 10:58:51.959204 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-utilities\") pod \"redhat-operators-6l8kp\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.047941 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.062830 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-utilities\") pod \"redhat-operators-6l8kp\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.063010 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-catalog-content\") pod \"redhat-operators-6l8kp\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.063040 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ggr6f\" (UniqueName: \"kubernetes.io/projected/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-kube-api-access-ggr6f\") pod \"redhat-operators-6l8kp\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.063979 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-utilities\") pod \"redhat-operators-6l8kp\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.064439 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrd6b"] Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.068015 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-catalog-content\") pod \"redhat-operators-6l8kp\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.102832 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ggr6f\" (UniqueName: \"kubernetes.io/projected/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-kube-api-access-ggr6f\") pod \"redhat-operators-6l8kp\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.119044 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.129160 4772 patch_prober.go:28] interesting pod/router-default-5444994796-9drnj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:52 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Mar 20 
10:58:52 crc kubenswrapper[4772]: [+]process-running ok Mar 20 10:58:52 crc kubenswrapper[4772]: healthz check failed Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.129242 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9drnj" podUID="81a4f4cd-feb2-4c87-99f7-04202818012f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.164310 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06381439-6997-45aa-8dce-62b012b0ac68-config-volume\") pod \"06381439-6997-45aa-8dce-62b012b0ac68\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.164471 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06381439-6997-45aa-8dce-62b012b0ac68-secret-volume\") pod \"06381439-6997-45aa-8dce-62b012b0ac68\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.164533 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dl6dx\" (UniqueName: \"kubernetes.io/projected/06381439-6997-45aa-8dce-62b012b0ac68-kube-api-access-dl6dx\") pod \"06381439-6997-45aa-8dce-62b012b0ac68\" (UID: \"06381439-6997-45aa-8dce-62b012b0ac68\") " Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.166621 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06381439-6997-45aa-8dce-62b012b0ac68-config-volume" (OuterVolumeSpecName: "config-volume") pod "06381439-6997-45aa-8dce-62b012b0ac68" (UID: "06381439-6997-45aa-8dce-62b012b0ac68"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.181013 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/06381439-6997-45aa-8dce-62b012b0ac68-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "06381439-6997-45aa-8dce-62b012b0ac68" (UID: "06381439-6997-45aa-8dce-62b012b0ac68"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.198055 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06381439-6997-45aa-8dce-62b012b0ac68-kube-api-access-dl6dx" (OuterVolumeSpecName: "kube-api-access-dl6dx") pod "06381439-6997-45aa-8dce-62b012b0ac68" (UID: "06381439-6997-45aa-8dce-62b012b0ac68"). InnerVolumeSpecName "kube-api-access-dl6dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.213526 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.268257 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/06381439-6997-45aa-8dce-62b012b0ac68-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.268282 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dl6dx\" (UniqueName: \"kubernetes.io/projected/06381439-6997-45aa-8dce-62b012b0ac68-kube-api-access-dl6dx\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.268292 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/06381439-6997-45aa-8dce-62b012b0ac68-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.288426 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qdlzf"] Mar 20 10:58:52 crc kubenswrapper[4772]: E0320 10:58:52.288671 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06381439-6997-45aa-8dce-62b012b0ac68" containerName="collect-profiles" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.288688 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="06381439-6997-45aa-8dce-62b012b0ac68" containerName="collect-profiles" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.288855 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="06381439-6997-45aa-8dce-62b012b0ac68" containerName="collect-profiles" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.292947 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.293613 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.310220 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qdlzf"] Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.368677 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/133225e8-b536-4144-9630-51c10d85f663-kubelet-dir\") pod \"133225e8-b536-4144-9630-51c10d85f663\" (UID: \"133225e8-b536-4144-9630-51c10d85f663\") " Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.368783 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/133225e8-b536-4144-9630-51c10d85f663-kube-api-access\") pod \"133225e8-b536-4144-9630-51c10d85f663\" (UID: \"133225e8-b536-4144-9630-51c10d85f663\") " Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.368933 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/133225e8-b536-4144-9630-51c10d85f663-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "133225e8-b536-4144-9630-51c10d85f663" (UID: "133225e8-b536-4144-9630-51c10d85f663"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.368979 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpgpg\" (UniqueName: \"kubernetes.io/projected/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-kube-api-access-hpgpg\") pod \"redhat-operators-qdlzf\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.369172 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-utilities\") pod \"redhat-operators-qdlzf\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.369431 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-catalog-content\") pod \"redhat-operators-qdlzf\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.369611 4772 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/133225e8-b536-4144-9630-51c10d85f663-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.393760 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/133225e8-b536-4144-9630-51c10d85f663-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "133225e8-b536-4144-9630-51c10d85f663" (UID: "133225e8-b536-4144-9630-51c10d85f663"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.473973 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpgpg\" (UniqueName: \"kubernetes.io/projected/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-kube-api-access-hpgpg\") pod \"redhat-operators-qdlzf\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.474057 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-utilities\") pod \"redhat-operators-qdlzf\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.474102 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-catalog-content\") pod \"redhat-operators-qdlzf\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.474134 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/133225e8-b536-4144-9630-51c10d85f663-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.474517 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-catalog-content\") pod \"redhat-operators-qdlzf\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.474953 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-utilities\") pod \"redhat-operators-qdlzf\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.529043 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpgpg\" (UniqueName: \"kubernetes.io/projected/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-kube-api-access-hpgpg\") pod \"redhat-operators-qdlzf\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.578216 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.609872 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.617642 4772 generic.go:334] "Generic (PLEG): container finished" podID="4eadb8ff-b747-4293-800f-b9894eb72ee3" containerID="8c739fcc8dbd9bac6a31fb014d95a929656927a4666b5af0a087bd256e3950b0" exitCode=0 Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.617752 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr4qj" event={"ID":"4eadb8ff-b747-4293-800f-b9894eb72ee3","Type":"ContainerDied","Data":"8c739fcc8dbd9bac6a31fb014d95a929656927a4666b5af0a087bd256e3950b0"} Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.617789 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr4qj" event={"ID":"4eadb8ff-b747-4293-800f-b9894eb72ee3","Type":"ContainerStarted","Data":"60a5c5c418964f777a5f5a040d631fd21f9a272690ffe1dbd0db82d29a6fe4f9"} Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.629255 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.630766 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg" event={"ID":"06381439-6997-45aa-8dce-62b012b0ac68","Type":"ContainerDied","Data":"6db455a1c0890d2f4c332ded4da5f2ba5e4832736b33b1cd3b9d8218edc37c8e"} Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.630851 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6db455a1c0890d2f4c332ded4da5f2ba5e4832736b33b1cd3b9d8218edc37c8e" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.644395 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.649527 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"133225e8-b536-4144-9630-51c10d85f663","Type":"ContainerDied","Data":"3876d7c0eb0c78c701fd13c0eac7d159f759eabaf0f9af112a0fdd7f03abf097"} Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.649580 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="3876d7c0eb0c78c701fd13c0eac7d159f759eabaf0f9af112a0fdd7f03abf097" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.666714 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrd6b" event={"ID":"04199621-c96a-4c6e-b7c0-3559112cc4fc","Type":"ContainerStarted","Data":"b2f4a285ef424ac7f8e0a116f7a0546f4d58d36738d88d8c4e851e79b240b085"} Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.676068 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-fb454" Mar 20 10:58:52 crc kubenswrapper[4772]: I0320 10:58:52.683482 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-s7p9n" Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.108744 4772 patch_prober.go:28] interesting pod/router-default-5444994796-9drnj container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Mar 20 10:58:53 crc kubenswrapper[4772]: [-]has-synced failed: reason withheld Mar 20 10:58:53 crc kubenswrapper[4772]: [+]process-running ok Mar 20 10:58:53 crc kubenswrapper[4772]: healthz check failed Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.109184 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-9drnj" podUID="81a4f4cd-feb2-4c87-99f7-04202818012f" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.126872 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-6l8kp"] Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.374710 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qdlzf"] Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.699151 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdlzf" event={"ID":"38661b1d-4edd-438e-b69b-6e9f9c8a7d65","Type":"ContainerStarted","Data":"002c321de43dff0295934eb0ec3482f04c29b4e9c635a8567d0e5bac56a8d3fa"} Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.716329 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6l8kp" event={"ID":"3514d32d-88b3-47e4-b541-6ab2d46a6cfe","Type":"ContainerStarted","Data":"32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5"} Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.716378 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6l8kp" event={"ID":"3514d32d-88b3-47e4-b541-6ab2d46a6cfe","Type":"ContainerStarted","Data":"69add3e429d14dbbc50247c1c652f40e380e2e90b279e9569f66784030708d14"} Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.720711 4772 
generic.go:334] "Generic (PLEG): container finished" podID="04199621-c96a-4c6e-b7c0-3559112cc4fc" containerID="c98cb487fd5e4cc70a2374c5b1d57cc12ceea4c1e0321745a83f41a0af7e3386" exitCode=0 Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.720785 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrd6b" event={"ID":"04199621-c96a-4c6e-b7c0-3559112cc4fc","Type":"ContainerDied","Data":"c98cb487fd5e4cc70a2374c5b1d57cc12ceea4c1e0321745a83f41a0af7e3386"} Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.731654 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"20ef433c-e850-4a0b-9104-4e53f5c8b82d","Type":"ContainerStarted","Data":"efafedcbefd6b49e80cb411ccca4c6a1dab984de3869a7901f6268ef2fcd8c51"} Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.731701 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"20ef433c-e850-4a0b-9104-4e53f5c8b82d","Type":"ContainerStarted","Data":"e98c6149c559d1c36e6efc07b967f80f6b183e043a20d417514fe20bd3b46120"} Mar 20 10:58:53 crc kubenswrapper[4772]: I0320 10:58:53.779085 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=2.77906788 podStartE2EDuration="2.77906788s" podCreationTimestamp="2026-03-20 10:58:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:58:53.754478669 +0000 UTC m=+219.845445144" watchObservedRunningTime="2026-03-20 10:58:53.77906788 +0000 UTC m=+219.870034365" Mar 20 10:58:54 crc kubenswrapper[4772]: I0320 10:58:54.106602 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:54 crc kubenswrapper[4772]: I0320 10:58:54.108780 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-9drnj" Mar 20 10:58:54 crc kubenswrapper[4772]: I0320 10:58:54.132368 4772 ???:1] "http: TLS handshake error from 192.168.126.11:59200: no serving certificate available for the kubelet" Mar 20 10:58:54 crc kubenswrapper[4772]: I0320 10:58:54.743509 4772 generic.go:334] "Generic (PLEG): container finished" podID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" containerID="32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5" exitCode=0 Mar 20 10:58:54 crc kubenswrapper[4772]: I0320 10:58:54.743528 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6l8kp" event={"ID":"3514d32d-88b3-47e4-b541-6ab2d46a6cfe","Type":"ContainerDied","Data":"32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5"} Mar 20 10:58:54 crc kubenswrapper[4772]: I0320 10:58:54.748004 4772 generic.go:334] "Generic (PLEG): container finished" podID="20ef433c-e850-4a0b-9104-4e53f5c8b82d" containerID="efafedcbefd6b49e80cb411ccca4c6a1dab984de3869a7901f6268ef2fcd8c51" exitCode=0 Mar 20 10:58:54 crc kubenswrapper[4772]: I0320 10:58:54.748051 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"20ef433c-e850-4a0b-9104-4e53f5c8b82d","Type":"ContainerDied","Data":"efafedcbefd6b49e80cb411ccca4c6a1dab984de3869a7901f6268ef2fcd8c51"} Mar 20 10:58:54 crc kubenswrapper[4772]: I0320 10:58:54.756717 4772 generic.go:334] "Generic (PLEG): container 
finished" podID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerID="fceedb26eba475870e6510019a8b0e1080b68c2452dcee44d73fac8ebf97373e" exitCode=0 Mar 20 10:58:54 crc kubenswrapper[4772]: I0320 10:58:54.758086 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdlzf" event={"ID":"38661b1d-4edd-438e-b69b-6e9f9c8a7d65","Type":"ContainerDied","Data":"fceedb26eba475870e6510019a8b0e1080b68c2452dcee44d73fac8ebf97373e"} Mar 20 10:58:55 crc kubenswrapper[4772]: I0320 10:58:55.591646 4772 ???:1] "http: TLS handshake error from 192.168.126.11:59216: no serving certificate available for the kubelet" Mar 20 10:58:57 crc kubenswrapper[4772]: I0320 10:58:57.484023 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-7xktg" Mar 20 10:59:00 crc kubenswrapper[4772]: I0320 10:59:00.231702 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:59:00 crc kubenswrapper[4772]: I0320 10:59:00.238051 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2ac5550b-02eb-48b4-b62a-e21dd4429249-metrics-certs\") pod \"network-metrics-daemon-m8kjd\" (UID: \"2ac5550b-02eb-48b4-b62a-e21dd4429249\") " pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:59:00 crc kubenswrapper[4772]: I0320 10:59:00.370357 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-m8kjd" Mar 20 10:59:01 crc kubenswrapper[4772]: I0320 10:59:01.381076 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-lg2z9" Mar 20 10:59:01 crc kubenswrapper[4772]: I0320 10:59:01.627563 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:59:01 crc kubenswrapper[4772]: I0320 10:59:01.632252 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 10:59:02 crc kubenswrapper[4772]: I0320 10:59:02.184990 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:59:02 crc kubenswrapper[4772]: I0320 10:59:02.301739 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kubelet-dir\") pod \"20ef433c-e850-4a0b-9104-4e53f5c8b82d\" (UID: \"20ef433c-e850-4a0b-9104-4e53f5c8b82d\") " Mar 20 10:59:02 crc kubenswrapper[4772]: I0320 10:59:02.302158 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kube-api-access\") pod \"20ef433c-e850-4a0b-9104-4e53f5c8b82d\" (UID: \"20ef433c-e850-4a0b-9104-4e53f5c8b82d\") " Mar 20 10:59:02 crc kubenswrapper[4772]: I0320 10:59:02.301820 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "20ef433c-e850-4a0b-9104-4e53f5c8b82d" (UID: "20ef433c-e850-4a0b-9104-4e53f5c8b82d"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:02 crc kubenswrapper[4772]: I0320 10:59:02.302397 4772 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4772]: I0320 10:59:02.308249 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "20ef433c-e850-4a0b-9104-4e53f5c8b82d" (UID: "20ef433c-e850-4a0b-9104-4e53f5c8b82d"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:02 crc kubenswrapper[4772]: I0320 10:59:02.403312 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/20ef433c-e850-4a0b-9104-4e53f5c8b82d-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:02 crc kubenswrapper[4772]: I0320 10:59:02.847000 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"20ef433c-e850-4a0b-9104-4e53f5c8b82d","Type":"ContainerDied","Data":"e98c6149c559d1c36e6efc07b967f80f6b183e043a20d417514fe20bd3b46120"} Mar 20 10:59:02 crc kubenswrapper[4772]: I0320 10:59:02.847047 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e98c6149c559d1c36e6efc07b967f80f6b183e043a20d417514fe20bd3b46120" Mar 20 10:59:02 crc kubenswrapper[4772]: I0320 10:59:02.847125 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Mar 20 10:59:04 crc kubenswrapper[4772]: I0320 10:59:04.273348 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8"] Mar 20 10:59:04 crc kubenswrapper[4772]: I0320 10:59:04.273527 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" podUID="1b52a731-93cf-4a4b-a491-b73c221bc21e" containerName="controller-manager" containerID="cri-o://6ac0f5820b4b8a91ff4e02d2b0c429104d06a144879a97466f4c8c7653e57a1e" gracePeriod=30 Mar 20 10:59:04 crc kubenswrapper[4772]: I0320 10:59:04.291827 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq"] Mar 20 10:59:04 crc kubenswrapper[4772]: I0320 10:59:04.299200 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" podUID="f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" containerName="route-controller-manager" containerID="cri-o://8a3b11ccb8001ae517392f1a66c37b842284eb4dfbff3df9cdba744806b08c15" gracePeriod=30 Mar 20 10:59:04 crc kubenswrapper[4772]: I0320 10:59:04.390006 4772 ???:1] "http: TLS handshake error from 192.168.126.11:37150: no serving certificate available for the kubelet" Mar 20 10:59:05 crc kubenswrapper[4772]: I0320 10:59:05.870616 4772 generic.go:334] "Generic (PLEG): container finished" podID="f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" containerID="8a3b11ccb8001ae517392f1a66c37b842284eb4dfbff3df9cdba744806b08c15" exitCode=0 Mar 20 10:59:05 crc kubenswrapper[4772]: I0320 10:59:05.870705 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" event={"ID":"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f","Type":"ContainerDied","Data":"8a3b11ccb8001ae517392f1a66c37b842284eb4dfbff3df9cdba744806b08c15"} Mar 20 10:59:05 crc kubenswrapper[4772]: I0320 10:59:05.872468 4772 generic.go:334] "Generic (PLEG): container finished" podID="1b52a731-93cf-4a4b-a491-b73c221bc21e" containerID="6ac0f5820b4b8a91ff4e02d2b0c429104d06a144879a97466f4c8c7653e57a1e" exitCode=0 Mar 20 10:59:05 crc kubenswrapper[4772]: I0320 10:59:05.872497 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" event={"ID":"1b52a731-93cf-4a4b-a491-b73c221bc21e","Type":"ContainerDied","Data":"6ac0f5820b4b8a91ff4e02d2b0c429104d06a144879a97466f4c8c7653e57a1e"} Mar 20 10:59:08 crc kubenswrapper[4772]: I0320 10:59:08.506624 4772 patch_prober.go:28] interesting pod/route-controller-manager-6446bb59c7-xphgq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" start-of-body= Mar 20 10:59:08 crc kubenswrapper[4772]: I0320 10:59:08.508039 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" podUID="f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: connect: connection refused" Mar 20 10:59:09 crc kubenswrapper[4772]: I0320 10:59:09.225329 4772 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 10:59:09 crc kubenswrapper[4772]: I0320 10:59:09.564780 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:59:09 crc kubenswrapper[4772]: I0320 10:59:09.564895 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:59:10 crc kubenswrapper[4772]: I0320 10:59:10.592316 4772 patch_prober.go:28] interesting pod/controller-manager-7d95bb6b4-mgtf8 container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" start-of-body= Mar 20 10:59:10 crc kubenswrapper[4772]: I0320 10:59:10.592430 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" podUID="1b52a731-93cf-4a4b-a491-b73c221bc21e" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.51:8443/healthz\": dial tcp 10.217.0.51:8443: connect: connection refused" Mar 20 10:59:19 crc kubenswrapper[4772]: E0320 10:59:19.044387 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/openshift4/ose-cli:latest" Mar 20 10:59:19 crc kubenswrapper[4772]: E0320 10:59:19.045164 4772 kuberuntime_manager.go:1274] "Unhandled Error" err=< Mar 20 10:59:19 crc kubenswrapper[4772]: container &Container{Name:oc,Image:registry.redhat.io/openshift4/ose-cli:latest,Command:[/bin/bash -c oc get csr -o go-template='{{range .items}}{{if not .status}}{{.metadata.name}}{{"\n"}}{{end}}{{end}}' | xargs --no-run-if-empty oc adm certificate approve Mar 20 10:59:19 crc kubenswrapper[4772]: ],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-d66cw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:nil,Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod auto-csr-approver-29566738-zlqcq_openshift-infra(9309f110-5a80-46ca-b3de-8087048c13e2): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled Mar 20 10:59:19 crc kubenswrapper[4772]: > logger="UnhandledError" Mar 20 10:59:19 crc kubenswrapper[4772]: E0320 10:59:19.046517 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ErrImagePull: \"rpc error: code 
= Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-infra/auto-csr-approver-29566738-zlqcq" podUID="9309f110-5a80-46ca-b3de-8087048c13e2" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.087038 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.092362 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.112937 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k"] Mar 20 10:59:19 crc kubenswrapper[4772]: E0320 10:59:19.113145 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="20ef433c-e850-4a0b-9104-4e53f5c8b82d" containerName="pruner" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.113157 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="20ef433c-e850-4a0b-9104-4e53f5c8b82d" containerName="pruner" Mar 20 10:59:19 crc kubenswrapper[4772]: E0320 10:59:19.113166 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="133225e8-b536-4144-9630-51c10d85f663" containerName="pruner" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.113172 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="133225e8-b536-4144-9630-51c10d85f663" containerName="pruner" Mar 20 10:59:19 crc kubenswrapper[4772]: E0320 10:59:19.113182 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" containerName="route-controller-manager" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.113188 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" containerName="route-controller-manager" Mar 20 10:59:19 crc kubenswrapper[4772]: E0320 10:59:19.113209 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1b52a731-93cf-4a4b-a491-b73c221bc21e" containerName="controller-manager" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.113216 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b52a731-93cf-4a4b-a491-b73c221bc21e" containerName="controller-manager" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.113303 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="1b52a731-93cf-4a4b-a491-b73c221bc21e" containerName="controller-manager" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.113315 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="20ef433c-e850-4a0b-9104-4e53f5c8b82d" containerName="pruner" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.113328 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="133225e8-b536-4144-9630-51c10d85f663" containerName="pruner" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.113336 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" containerName="route-controller-manager" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.113678 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.128074 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k"] Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.147369 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.147363 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" event={"ID":"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f","Type":"ContainerDied","Data":"5ac7a556e5620787412e050b275df1204ae61702c039b052461057f3edb6a5df"} Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.147520 4772 scope.go:117] "RemoveContainer" containerID="8a3b11ccb8001ae517392f1a66c37b842284eb4dfbff3df9cdba744806b08c15" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.149154 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.149147 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8" event={"ID":"1b52a731-93cf-4a4b-a491-b73c221bc21e","Type":"ContainerDied","Data":"99037965bfd20fa755386370ea804f5d48209f32c85af703a4a6a65688bfbf6b"} Mar 20 10:59:19 crc kubenswrapper[4772]: E0320 10:59:19.151717 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"oc\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/openshift4/ose-cli:latest\\\"\"" pod="openshift-infra/auto-csr-approver-29566738-zlqcq" podUID="9309f110-5a80-46ca-b3de-8087048c13e2" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.247882 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-client-ca\") pod \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.247942 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-config\") pod \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248001 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b52a731-93cf-4a4b-a491-b73c221bc21e-serving-cert\") pod \"1b52a731-93cf-4a4b-a491-b73c221bc21e\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248040 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-client-ca\") pod \"1b52a731-93cf-4a4b-a491-b73c221bc21e\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248070 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jzl92\" (UniqueName: 
\"kubernetes.io/projected/1b52a731-93cf-4a4b-a491-b73c221bc21e-kube-api-access-jzl92\") pod \"1b52a731-93cf-4a4b-a491-b73c221bc21e\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248102 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-serving-cert\") pod \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248214 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-proxy-ca-bundles\") pod \"1b52a731-93cf-4a4b-a491-b73c221bc21e\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248239 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b9brg\" (UniqueName: \"kubernetes.io/projected/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-kube-api-access-b9brg\") pod \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\" (UID: \"f95f3b03-0f29-4683-b1eb-11ebc4d66b3f\") " Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248260 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-config\") pod \"1b52a731-93cf-4a4b-a491-b73c221bc21e\" (UID: \"1b52a731-93cf-4a4b-a491-b73c221bc21e\") " Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248469 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-proxy-ca-bundles\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248814 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-client-ca\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248863 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mc4md\" (UniqueName: \"kubernetes.io/projected/bc9a46a8-533c-4717-88a5-466e196ef9f5-kube-api-access-mc4md\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248910 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc9a46a8-533c-4717-88a5-466e196ef9f5-serving-cert\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.248957 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-config\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.249414 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "1b52a731-93cf-4a4b-a491-b73c221bc21e" (UID: "1b52a731-93cf-4a4b-a491-b73c221bc21e"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.250557 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-client-ca" (OuterVolumeSpecName: "client-ca") pod "1b52a731-93cf-4a4b-a491-b73c221bc21e" (UID: "1b52a731-93cf-4a4b-a491-b73c221bc21e"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.250356 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-config" (OuterVolumeSpecName: "config") pod "1b52a731-93cf-4a4b-a491-b73c221bc21e" (UID: "1b52a731-93cf-4a4b-a491-b73c221bc21e"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.251140 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-config" (OuterVolumeSpecName: "config") pod "f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" (UID: "f95f3b03-0f29-4683-b1eb-11ebc4d66b3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.251387 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-client-ca" (OuterVolumeSpecName: "client-ca") pod "f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" (UID: "f95f3b03-0f29-4683-b1eb-11ebc4d66b3f"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.254512 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b52a731-93cf-4a4b-a491-b73c221bc21e-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1b52a731-93cf-4a4b-a491-b73c221bc21e" (UID: "1b52a731-93cf-4a4b-a491-b73c221bc21e"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.254781 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" (UID: "f95f3b03-0f29-4683-b1eb-11ebc4d66b3f"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.257023 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-kube-api-access-b9brg" (OuterVolumeSpecName: "kube-api-access-b9brg") pod "f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" (UID: "f95f3b03-0f29-4683-b1eb-11ebc4d66b3f"). InnerVolumeSpecName "kube-api-access-b9brg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.261980 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b52a731-93cf-4a4b-a491-b73c221bc21e-kube-api-access-jzl92" (OuterVolumeSpecName: "kube-api-access-jzl92") pod "1b52a731-93cf-4a4b-a491-b73c221bc21e" (UID: "1b52a731-93cf-4a4b-a491-b73c221bc21e"). InnerVolumeSpecName "kube-api-access-jzl92". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.350557 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc9a46a8-533c-4717-88a5-466e196ef9f5-serving-cert\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.350623 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-config\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.350663 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-proxy-ca-bundles\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.350696 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-client-ca\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.351608 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mc4md\" (UniqueName: \"kubernetes.io/projected/bc9a46a8-533c-4717-88a5-466e196ef9f5-kube-api-access-mc4md\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.351656 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.351667 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-config\") on 
node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.351676 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1b52a731-93cf-4a4b-a491-b73c221bc21e-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.351786 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.352115 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jzl92\" (UniqueName: \"kubernetes.io/projected/1b52a731-93cf-4a4b-a491-b73c221bc21e-kube-api-access-jzl92\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.352150 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.352161 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.352170 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1b52a731-93cf-4a4b-a491-b73c221bc21e-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.352181 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b9brg\" (UniqueName: \"kubernetes.io/projected/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f-kube-api-access-b9brg\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.352693 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-client-ca\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.352995 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-proxy-ca-bundles\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.353340 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-config\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.354591 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc9a46a8-533c-4717-88a5-466e196ef9f5-serving-cert\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc 
kubenswrapper[4772]: I0320 10:59:19.365901 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mc4md\" (UniqueName: \"kubernetes.io/projected/bc9a46a8-533c-4717-88a5-466e196ef9f5-kube-api-access-mc4md\") pod \"controller-manager-6b89f4dcb8-g4m8k\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.437118 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.479003 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq"] Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.481221 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq"] Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.488902 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8"] Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.491670 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-7d95bb6b4-mgtf8"] Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.506601 4772 patch_prober.go:28] interesting pod/route-controller-manager-6446bb59c7-xphgq container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: i/o timeout (Client.Timeout exceeded while awaiting headers)" start-of-body= Mar 20 10:59:19 crc kubenswrapper[4772]: I0320 10:59:19.506647 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6446bb59c7-xphgq" podUID="f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.45:8443/healthz\": dial tcp 10.217.0.45:8443: i/o timeout (Client.Timeout exceeded while awaiting headers)" Mar 20 10:59:20 crc kubenswrapper[4772]: I0320 10:59:20.647532 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b52a731-93cf-4a4b-a491-b73c221bc21e" path="/var/lib/kubelet/pods/1b52a731-93cf-4a4b-a491-b73c221bc21e/volumes" Mar 20 10:59:20 crc kubenswrapper[4772]: I0320 10:59:20.648138 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f95f3b03-0f29-4683-b1eb-11ebc4d66b3f" path="/var/lib/kubelet/pods/f95f3b03-0f29-4683-b1eb-11ebc4d66b3f/volumes" Mar 20 10:59:21 crc kubenswrapper[4772]: E0320 10:59:21.039923 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Mar 20 10:59:21 crc kubenswrapper[4772]: E0320 10:59:21.040069 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-r4v48,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-pr4qj_openshift-marketplace(4eadb8ff-b747-4293-800f-b9894eb72ee3): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:21 crc kubenswrapper[4772]: E0320 10:59:21.041290 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-pr4qj" podUID="4eadb8ff-b747-4293-800f-b9894eb72ee3" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.283078 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm"] Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.284206 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.284345 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm"] Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.287431 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.287686 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.287707 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.289081 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.289113 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.289162 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.381143 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eff70a36-e66f-4aac-b0c0-b628727f67e0-serving-cert\") pod \"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.381216 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-config\") pod \"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.381248 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-client-ca\") pod \"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.381358 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k59dd\" (UniqueName: \"kubernetes.io/projected/eff70a36-e66f-4aac-b0c0-b628727f67e0-kube-api-access-k59dd\") pod \"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.482996 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eff70a36-e66f-4aac-b0c0-b628727f67e0-serving-cert\") pod 
\"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.483059 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-config\") pod \"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.483075 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-client-ca\") pod \"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.483136 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k59dd\" (UniqueName: \"kubernetes.io/projected/eff70a36-e66f-4aac-b0c0-b628727f67e0-kube-api-access-k59dd\") pod \"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.484271 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-client-ca\") pod \"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.484511 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-config\") pod \"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.490336 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eff70a36-e66f-4aac-b0c0-b628727f67e0-serving-cert\") pod \"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.497971 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k59dd\" (UniqueName: \"kubernetes.io/projected/eff70a36-e66f-4aac-b0c0-b628727f67e0-kube-api-access-k59dd\") pod \"route-controller-manager-544f985986-bx4gm\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:21 crc kubenswrapper[4772]: I0320 10:59:21.620729 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:22 crc kubenswrapper[4772]: I0320 10:59:22.242357 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-rjcd2" Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.280135 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k"] Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.370891 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm"] Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.666453 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.667469 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.671216 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.671258 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.678583 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:59:24 crc kubenswrapper[4772]: E0320 10:59:24.792643 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-pr4qj" podUID="4eadb8ff-b747-4293-800f-b9894eb72ee3" Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.827364 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76595817-e040-4808-a8b5-e46e81712f9e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"76595817-e040-4808-a8b5-e46e81712f9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.827441 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76595817-e040-4808-a8b5-e46e81712f9e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"76595817-e040-4808-a8b5-e46e81712f9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.929282 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76595817-e040-4808-a8b5-e46e81712f9e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"76595817-e040-4808-a8b5-e46e81712f9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.929367 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76595817-e040-4808-a8b5-e46e81712f9e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"76595817-e040-4808-a8b5-e46e81712f9e\") " 
pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.929480 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76595817-e040-4808-a8b5-e46e81712f9e-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"76595817-e040-4808-a8b5-e46e81712f9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.947170 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76595817-e040-4808-a8b5-e46e81712f9e-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"76595817-e040-4808-a8b5-e46e81712f9e\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:24 crc kubenswrapper[4772]: I0320 10:59:24.997107 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:26 crc kubenswrapper[4772]: E0320 10:59:26.458149 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 10:59:26 crc kubenswrapper[4772]: E0320 10:59:26.458348 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-fr9dx,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-8kc55_openshift-marketplace(22e23182-8e10-42d5-b34d-f09f6f280262): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:26 crc kubenswrapper[4772]: E0320 10:59:26.459542 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" 
pod="openshift-marketplace/community-operators-8kc55" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" Mar 20 10:59:26 crc kubenswrapper[4772]: E0320 10:59:26.536801 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Mar 20 10:59:26 crc kubenswrapper[4772]: E0320 10:59:26.536996 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wnxsh,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-7mp8g_openshift-marketplace(3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:26 crc kubenswrapper[4772]: E0320 10:59:26.538192 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-7mp8g" podUID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" Mar 20 10:59:27 crc kubenswrapper[4772]: I0320 10:59:27.191237 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Mar 20 10:59:27 crc kubenswrapper[4772]: E0320 10:59:27.847525 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-7mp8g" podUID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" Mar 20 10:59:27 crc kubenswrapper[4772]: E0320 10:59:27.847572 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image 
\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-8kc55" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" Mar 20 10:59:27 crc kubenswrapper[4772]: I0320 10:59:27.904187 4772 scope.go:117] "RemoveContainer" containerID="6ac0f5820b4b8a91ff4e02d2b0c429104d06a144879a97466f4c8c7653e57a1e" Mar 20 10:59:27 crc kubenswrapper[4772]: E0320 10:59:27.931724 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 10:59:27 crc kubenswrapper[4772]: E0320 10:59:27.932020 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-gf9mw,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-f9wnx_openshift-marketplace(b1148455-d28e-4866-8b3e-cbabeaad84c7): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:27 crc kubenswrapper[4772]: E0320 10:59:27.934722 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-f9wnx" podUID="b1148455-d28e-4866-8b3e-cbabeaad84c7" Mar 20 10:59:27 crc kubenswrapper[4772]: E0320 10:59:27.969717 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Mar 20 10:59:27 crc kubenswrapper[4772]: E0320 10:59:27.969872 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container 
&Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-x6cg2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-pt8p5_openshift-marketplace(8db2c4ed-fcb2-48eb-a1a0-1be1d8613260): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Mar 20 10:59:27 crc kubenswrapper[4772]: E0320 10:59:27.971868 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-pt8p5" podUID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" Mar 20 10:59:28 crc kubenswrapper[4772]: I0320 10:59:28.198616 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdlzf" event={"ID":"38661b1d-4edd-438e-b69b-6e9f9c8a7d65","Type":"ContainerStarted","Data":"5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc"} Mar 20 10:59:28 crc kubenswrapper[4772]: I0320 10:59:28.206561 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6l8kp" event={"ID":"3514d32d-88b3-47e4-b541-6ab2d46a6cfe","Type":"ContainerStarted","Data":"54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71"} Mar 20 10:59:28 crc kubenswrapper[4772]: I0320 10:59:28.217700 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrd6b" event={"ID":"04199621-c96a-4c6e-b7c0-3559112cc4fc","Type":"ContainerStarted","Data":"8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de"} Mar 20 10:59:28 crc kubenswrapper[4772]: E0320 10:59:28.218162 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-f9wnx" podUID="b1148455-d28e-4866-8b3e-cbabeaad84c7" 
Mar 20 10:59:28 crc kubenswrapper[4772]: E0320 10:59:28.220736 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-pt8p5" podUID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" Mar 20 10:59:28 crc kubenswrapper[4772]: I0320 10:59:28.267788 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-m8kjd"] Mar 20 10:59:28 crc kubenswrapper[4772]: W0320 10:59:28.350177 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ac5550b_02eb_48b4_b62a_e21dd4429249.slice/crio-822dd136e5b1f678d2c9175e011585009bb1d3a3ff148836574c8cc0dcaab10c WatchSource:0}: Error finding container 822dd136e5b1f678d2c9175e011585009bb1d3a3ff148836574c8cc0dcaab10c: Status 404 returned error can't find the container with id 822dd136e5b1f678d2c9175e011585009bb1d3a3ff148836574c8cc0dcaab10c Mar 20 10:59:28 crc kubenswrapper[4772]: I0320 10:59:28.374957 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Mar 20 10:59:28 crc kubenswrapper[4772]: I0320 10:59:28.383757 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k"] Mar 20 10:59:28 crc kubenswrapper[4772]: I0320 10:59:28.499950 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm"] Mar 20 10:59:28 crc kubenswrapper[4772]: W0320 10:59:28.505551 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeff70a36_e66f_4aac_b0c0_b628727f67e0.slice/crio-69641f4c0e65968d0e4092fcc69f38ea53fed47c55f3692a53e0faab7817e34e WatchSource:0}: Error finding container 69641f4c0e65968d0e4092fcc69f38ea53fed47c55f3692a53e0faab7817e34e: Status 404 returned error can't find the container with id 69641f4c0e65968d0e4092fcc69f38ea53fed47c55f3692a53e0faab7817e34e Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.229172 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"76595817-e040-4808-a8b5-e46e81712f9e","Type":"ContainerStarted","Data":"1b9855e45460d355fa623d04dc9065c3004e632afcae305e541ccea59524d549"} Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.229539 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"76595817-e040-4808-a8b5-e46e81712f9e","Type":"ContainerStarted","Data":"2b9fd4dd7349adcf4e12149cfadb168736ad16e2297482fa7838a0652961c133"} Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.234630 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" event={"ID":"2ac5550b-02eb-48b4-b62a-e21dd4429249","Type":"ContainerStarted","Data":"9445ea6c1a187e29648b2d632e8553dc704d70fb147deadb9d840a47b3ba903d"} Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.234671 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" event={"ID":"2ac5550b-02eb-48b4-b62a-e21dd4429249","Type":"ContainerStarted","Data":"822dd136e5b1f678d2c9175e011585009bb1d3a3ff148836574c8cc0dcaab10c"} Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 
10:59:29.236821 4772 generic.go:334] "Generic (PLEG): container finished" podID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerID="5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc" exitCode=0 Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.236899 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdlzf" event={"ID":"38661b1d-4edd-438e-b69b-6e9f9c8a7d65","Type":"ContainerDied","Data":"5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc"} Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.241231 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" event={"ID":"bc9a46a8-533c-4717-88a5-466e196ef9f5","Type":"ContainerStarted","Data":"e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8"} Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.241261 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" event={"ID":"bc9a46a8-533c-4717-88a5-466e196ef9f5","Type":"ContainerStarted","Data":"30b06cc6c473a00166d1d350c6d555823804ea477b67e1ef15457f7a85de2ab0"} Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.241355 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" podUID="bc9a46a8-533c-4717-88a5-466e196ef9f5" containerName="controller-manager" containerID="cri-o://e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8" gracePeriod=30 Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.243684 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.248343 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-9-crc" podStartSLOduration=5.248321694 podStartE2EDuration="5.248321694s" podCreationTimestamp="2026-03-20 10:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:29.247181562 +0000 UTC m=+255.338148057" watchObservedRunningTime="2026-03-20 10:59:29.248321694 +0000 UTC m=+255.339288179" Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.249625 4772 generic.go:334] "Generic (PLEG): container finished" podID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" containerID="54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71" exitCode=0 Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.249781 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6l8kp" event={"ID":"3514d32d-88b3-47e4-b541-6ab2d46a6cfe","Type":"ContainerDied","Data":"54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71"} Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.259299 4772 generic.go:334] "Generic (PLEG): container finished" podID="04199621-c96a-4c6e-b7c0-3559112cc4fc" containerID="8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de" exitCode=0 Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.259547 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrd6b" event={"ID":"04199621-c96a-4c6e-b7c0-3559112cc4fc","Type":"ContainerDied","Data":"8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de"} Mar 20 10:59:29 crc 
kubenswrapper[4772]: I0320 10:59:29.262485 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" event={"ID":"eff70a36-e66f-4aac-b0c0-b628727f67e0","Type":"ContainerStarted","Data":"9be59bd010af78dae2c4e37c858dcdbb0fd3a66989889807a130e850c1e3797b"} Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.262527 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" event={"ID":"eff70a36-e66f-4aac-b0c0-b628727f67e0","Type":"ContainerStarted","Data":"69641f4c0e65968d0e4092fcc69f38ea53fed47c55f3692a53e0faab7817e34e"} Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.262659 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" podUID="eff70a36-e66f-4aac-b0c0-b628727f67e0" containerName="route-controller-manager" containerID="cri-o://9be59bd010af78dae2c4e37c858dcdbb0fd3a66989889807a130e850c1e3797b" gracePeriod=30 Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.262758 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.268324 4772 patch_prober.go:28] interesting pod/controller-manager-6b89f4dcb8-g4m8k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:37410->10.217.0.57:8443: read: connection reset by peer" start-of-body= Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.268361 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" podUID="bc9a46a8-533c-4717-88a5-466e196ef9f5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": read tcp 10.217.0.2:37410->10.217.0.57:8443: read: connection reset by peer" Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.291541 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" podStartSLOduration=25.291523028 podStartE2EDuration="25.291523028s" podCreationTimestamp="2026-03-20 10:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:29.286931489 +0000 UTC m=+255.377897974" watchObservedRunningTime="2026-03-20 10:59:29.291523028 +0000 UTC m=+255.382489513" Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.327554 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" podStartSLOduration=25.327529439 podStartE2EDuration="25.327529439s" podCreationTimestamp="2026-03-20 10:59:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:29.323152096 +0000 UTC m=+255.414118581" watchObservedRunningTime="2026-03-20 10:59:29.327529439 +0000 UTC m=+255.418495924" Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.406301 4772 patch_prober.go:28] interesting pod/route-controller-manager-544f985986-bx4gm container/route-controller-manager namespace/openshift-route-controller-manager: Readiness 
probe status=failure output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:46980->10.217.0.58:8443: read: connection reset by peer" start-of-body= Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.406603 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" podUID="eff70a36-e66f-4aac-b0c0-b628727f67e0" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.58:8443/healthz\": read tcp 10.217.0.2:46980->10.217.0.58:8443: read: connection reset by peer" Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.437569 4772 patch_prober.go:28] interesting pod/controller-manager-6b89f4dcb8-g4m8k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" start-of-body= Mar 20 10:59:29 crc kubenswrapper[4772]: I0320 10:59:29.437615 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" podUID="bc9a46a8-533c-4717-88a5-466e196ef9f5" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.57:8443/healthz\": dial tcp 10.217.0.57:8443: connect: connection refused" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.151480 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.184807 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6df7479c96-vblnv"] Mar 20 10:59:30 crc kubenswrapper[4772]: E0320 10:59:30.185154 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc9a46a8-533c-4717-88a5-466e196ef9f5" containerName="controller-manager" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.185176 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc9a46a8-533c-4717-88a5-466e196ef9f5" containerName="controller-manager" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.185270 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc9a46a8-533c-4717-88a5-466e196ef9f5" containerName="controller-manager" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.185685 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.186338 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6df7479c96-vblnv"] Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.283738 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-544f985986-bx4gm_eff70a36-e66f-4aac-b0c0-b628727f67e0/route-controller-manager/0.log" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.283785 4772 generic.go:334] "Generic (PLEG): container finished" podID="eff70a36-e66f-4aac-b0c0-b628727f67e0" containerID="9be59bd010af78dae2c4e37c858dcdbb0fd3a66989889807a130e850c1e3797b" exitCode=255 Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.283866 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" event={"ID":"eff70a36-e66f-4aac-b0c0-b628727f67e0","Type":"ContainerDied","Data":"9be59bd010af78dae2c4e37c858dcdbb0fd3a66989889807a130e850c1e3797b"} Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.285376 4772 generic.go:334] "Generic (PLEG): container finished" podID="76595817-e040-4808-a8b5-e46e81712f9e" containerID="1b9855e45460d355fa623d04dc9065c3004e632afcae305e541ccea59524d549" exitCode=0 Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.285458 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"76595817-e040-4808-a8b5-e46e81712f9e","Type":"ContainerDied","Data":"1b9855e45460d355fa623d04dc9065c3004e632afcae305e541ccea59524d549"} Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.287556 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-m8kjd" event={"ID":"2ac5550b-02eb-48b4-b62a-e21dd4429249","Type":"ContainerStarted","Data":"36fcb6076590b45b9e39dd2a8ab3db58be09782819671e3be8bff40f5c30597f"} Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.289497 4772 generic.go:334] "Generic (PLEG): container finished" podID="bc9a46a8-533c-4717-88a5-466e196ef9f5" containerID="e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8" exitCode=0 Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.289613 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" event={"ID":"bc9a46a8-533c-4717-88a5-466e196ef9f5","Type":"ContainerDied","Data":"e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8"} Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.289618 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.289633 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k" event={"ID":"bc9a46a8-533c-4717-88a5-466e196ef9f5","Type":"ContainerDied","Data":"30b06cc6c473a00166d1d350c6d555823804ea477b67e1ef15457f7a85de2ab0"} Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.289652 4772 scope.go:117] "RemoveContainer" containerID="e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.295724 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-proxy-ca-bundles\") pod \"bc9a46a8-533c-4717-88a5-466e196ef9f5\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.295787 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-client-ca\") pod \"bc9a46a8-533c-4717-88a5-466e196ef9f5\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.295821 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mc4md\" (UniqueName: \"kubernetes.io/projected/bc9a46a8-533c-4717-88a5-466e196ef9f5-kube-api-access-mc4md\") pod \"bc9a46a8-533c-4717-88a5-466e196ef9f5\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.295877 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-config\") pod \"bc9a46a8-533c-4717-88a5-466e196ef9f5\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.295945 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc9a46a8-533c-4717-88a5-466e196ef9f5-serving-cert\") pod \"bc9a46a8-533c-4717-88a5-466e196ef9f5\" (UID: \"bc9a46a8-533c-4717-88a5-466e196ef9f5\") " Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.296156 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-serving-cert\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.296182 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-proxy-ca-bundles\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.296217 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-config\") pod \"controller-manager-6df7479c96-vblnv\" 
(UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.296241 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jcjqh\" (UniqueName: \"kubernetes.io/projected/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-kube-api-access-jcjqh\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.296266 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-client-ca\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.297084 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-client-ca" (OuterVolumeSpecName: "client-ca") pod "bc9a46a8-533c-4717-88a5-466e196ef9f5" (UID: "bc9a46a8-533c-4717-88a5-466e196ef9f5"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.297348 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-config" (OuterVolumeSpecName: "config") pod "bc9a46a8-533c-4717-88a5-466e196ef9f5" (UID: "bc9a46a8-533c-4717-88a5-466e196ef9f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.298381 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "bc9a46a8-533c-4717-88a5-466e196ef9f5" (UID: "bc9a46a8-533c-4717-88a5-466e196ef9f5"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.305320 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc9a46a8-533c-4717-88a5-466e196ef9f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc9a46a8-533c-4717-88a5-466e196ef9f5" (UID: "bc9a46a8-533c-4717-88a5-466e196ef9f5"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.307609 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc9a46a8-533c-4717-88a5-466e196ef9f5-kube-api-access-mc4md" (OuterVolumeSpecName: "kube-api-access-mc4md") pod "bc9a46a8-533c-4717-88a5-466e196ef9f5" (UID: "bc9a46a8-533c-4717-88a5-466e196ef9f5"). InnerVolumeSpecName "kube-api-access-mc4md". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.323151 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-m8kjd" podStartSLOduration=213.323128449 podStartE2EDuration="3m33.323128449s" podCreationTimestamp="2026-03-20 10:55:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:30.315772881 +0000 UTC m=+256.406739386" watchObservedRunningTime="2026-03-20 10:59:30.323128449 +0000 UTC m=+256.414094924" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.397059 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jcjqh\" (UniqueName: \"kubernetes.io/projected/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-kube-api-access-jcjqh\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.397133 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-client-ca\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.397194 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-serving-cert\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.397214 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-proxy-ca-bundles\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.397256 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-config\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.397295 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.397306 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.397315 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mc4md\" (UniqueName: \"kubernetes.io/projected/bc9a46a8-533c-4717-88a5-466e196ef9f5-kube-api-access-mc4md\") on node \"crc\" DevicePath \"\"" Mar 20 
10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.397325 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bc9a46a8-533c-4717-88a5-466e196ef9f5-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.397334 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc9a46a8-533c-4717-88a5-466e196ef9f5-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.398275 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-client-ca\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.408625 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-proxy-ca-bundles\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.414993 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-serving-cert\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.415506 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-config\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.418438 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jcjqh\" (UniqueName: \"kubernetes.io/projected/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-kube-api-access-jcjqh\") pod \"controller-manager-6df7479c96-vblnv\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.423759 4772 scope.go:117] "RemoveContainer" containerID="e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8" Mar 20 10:59:30 crc kubenswrapper[4772]: E0320 10:59:30.424407 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8\": container with ID starting with e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8 not found: ID does not exist" containerID="e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.424547 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8"} err="failed to get container status \"e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8\": rpc error: 
code = NotFound desc = could not find container \"e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8\": container with ID starting with e64d11159316402af87147bfd011a2cbc85d6627273cf61da557db600d00e9b8 not found: ID does not exist" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.433095 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-544f985986-bx4gm_eff70a36-e66f-4aac-b0c0-b628727f67e0/route-controller-manager/0.log" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.433156 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.460786 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:59:30 crc kubenswrapper[4772]: E0320 10:59:30.461005 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="eff70a36-e66f-4aac-b0c0-b628727f67e0" containerName="route-controller-manager" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.461016 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="eff70a36-e66f-4aac-b0c0-b628727f67e0" containerName="route-controller-manager" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.461100 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="eff70a36-e66f-4aac-b0c0-b628727f67e0" containerName="route-controller-manager" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.461436 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.473418 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.513907 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.514052 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k59dd\" (UniqueName: \"kubernetes.io/projected/eff70a36-e66f-4aac-b0c0-b628727f67e0-kube-api-access-k59dd\") pod \"eff70a36-e66f-4aac-b0c0-b628727f67e0\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.514179 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-client-ca\") pod \"eff70a36-e66f-4aac-b0c0-b628727f67e0\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.514234 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eff70a36-e66f-4aac-b0c0-b628727f67e0-serving-cert\") pod \"eff70a36-e66f-4aac-b0c0-b628727f67e0\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.514251 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-config\") pod \"eff70a36-e66f-4aac-b0c0-b628727f67e0\" (UID: \"eff70a36-e66f-4aac-b0c0-b628727f67e0\") " Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.514816 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-client-ca" (OuterVolumeSpecName: "client-ca") pod "eff70a36-e66f-4aac-b0c0-b628727f67e0" (UID: "eff70a36-e66f-4aac-b0c0-b628727f67e0"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.515350 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-config" (OuterVolumeSpecName: "config") pod "eff70a36-e66f-4aac-b0c0-b628727f67e0" (UID: "eff70a36-e66f-4aac-b0c0-b628727f67e0"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.518441 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eff70a36-e66f-4aac-b0c0-b628727f67e0-kube-api-access-k59dd" (OuterVolumeSpecName: "kube-api-access-k59dd") pod "eff70a36-e66f-4aac-b0c0-b628727f67e0" (UID: "eff70a36-e66f-4aac-b0c0-b628727f67e0"). InnerVolumeSpecName "kube-api-access-k59dd". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.520400 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eff70a36-e66f-4aac-b0c0-b628727f67e0-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "eff70a36-e66f-4aac-b0c0-b628727f67e0" (UID: "eff70a36-e66f-4aac-b0c0-b628727f67e0"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.617327 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c939fe35-51ef-40a4-951c-cebac7f55e8c-kube-api-access\") pod \"installer-9-crc\" (UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.617590 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-var-lock\") pod \"installer-9-crc\" (UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.617853 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k"] Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.617909 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.618022 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.618059 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/eff70a36-e66f-4aac-b0c0-b628727f67e0-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.618084 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/eff70a36-e66f-4aac-b0c0-b628727f67e0-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.618109 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k59dd\" (UniqueName: \"kubernetes.io/projected/eff70a36-e66f-4aac-b0c0-b628727f67e0-kube-api-access-k59dd\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.620915 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6b89f4dcb8-g4m8k"] Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.648543 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc9a46a8-533c-4717-88a5-466e196ef9f5" path="/var/lib/kubelet/pods/bc9a46a8-533c-4717-88a5-466e196ef9f5/volumes" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.719653 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-var-lock\") pod \"installer-9-crc\" (UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.719747 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-kubelet-dir\") pod \"installer-9-crc\" 
(UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.719773 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c939fe35-51ef-40a4-951c-cebac7f55e8c-kube-api-access\") pod \"installer-9-crc\" (UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.720088 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-var-lock\") pod \"installer-9-crc\" (UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.720133 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-kubelet-dir\") pod \"installer-9-crc\" (UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.737363 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c939fe35-51ef-40a4-951c-cebac7f55e8c-kube-api-access\") pod \"installer-9-crc\" (UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:30 crc kubenswrapper[4772]: I0320 10:59:30.824247 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.107496 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6df7479c96-vblnv"] Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.305011 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdlzf" event={"ID":"38661b1d-4edd-438e-b69b-6e9f9c8a7d65","Type":"ContainerStarted","Data":"1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39"} Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.313208 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" event={"ID":"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6","Type":"ContainerStarted","Data":"ef0e5e5df7268c655c6d2f52869f5d1cc092ce9c081c5a43ef5c82594b40ad24"} Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.315925 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-route-controller-manager_route-controller-manager-544f985986-bx4gm_eff70a36-e66f-4aac-b0c0-b628727f67e0/route-controller-manager/0.log" Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.315997 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" event={"ID":"eff70a36-e66f-4aac-b0c0-b628727f67e0","Type":"ContainerDied","Data":"69641f4c0e65968d0e4092fcc69f38ea53fed47c55f3692a53e0faab7817e34e"} Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.316090 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm" Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.316117 4772 scope.go:117] "RemoveContainer" containerID="9be59bd010af78dae2c4e37c858dcdbb0fd3a66989889807a130e850c1e3797b" Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.324875 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qdlzf" podStartSLOduration=3.366165696 podStartE2EDuration="39.32485535s" podCreationTimestamp="2026-03-20 10:58:52 +0000 UTC" firstStartedPulling="2026-03-20 10:58:54.761036446 +0000 UTC m=+220.852002921" lastFinishedPulling="2026-03-20 10:59:30.71972609 +0000 UTC m=+256.810692575" observedRunningTime="2026-03-20 10:59:31.320695953 +0000 UTC m=+257.411662438" watchObservedRunningTime="2026-03-20 10:59:31.32485535 +0000 UTC m=+257.415821835" Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.345375 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm"] Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.365988 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-544f985986-bx4gm"] Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.370757 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.559887 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.733290 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76595817-e040-4808-a8b5-e46e81712f9e-kube-api-access\") pod \"76595817-e040-4808-a8b5-e46e81712f9e\" (UID: \"76595817-e040-4808-a8b5-e46e81712f9e\") " Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.733430 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76595817-e040-4808-a8b5-e46e81712f9e-kubelet-dir\") pod \"76595817-e040-4808-a8b5-e46e81712f9e\" (UID: \"76595817-e040-4808-a8b5-e46e81712f9e\") " Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.733547 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/76595817-e040-4808-a8b5-e46e81712f9e-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "76595817-e040-4808-a8b5-e46e81712f9e" (UID: "76595817-e040-4808-a8b5-e46e81712f9e"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.733936 4772 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/76595817-e040-4808-a8b5-e46e81712f9e-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.738962 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76595817-e040-4808-a8b5-e46e81712f9e-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "76595817-e040-4808-a8b5-e46e81712f9e" (UID: "76595817-e040-4808-a8b5-e46e81712f9e"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:31 crc kubenswrapper[4772]: I0320 10:59:31.837648 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/76595817-e040-4808-a8b5-e46e81712f9e-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.285019 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk"] Mar 20 10:59:32 crc kubenswrapper[4772]: E0320 10:59:32.285500 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="76595817-e040-4808-a8b5-e46e81712f9e" containerName="pruner" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.285512 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="76595817-e040-4808-a8b5-e46e81712f9e" containerName="pruner" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.285636 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="76595817-e040-4808-a8b5-e46e81712f9e" containerName="pruner" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.286004 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.288592 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.289046 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.289223 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.291748 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.291786 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.291818 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.294999 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk"] Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.322386 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c939fe35-51ef-40a4-951c-cebac7f55e8c","Type":"ContainerStarted","Data":"b3515e42162d3b9bc3278cd0aea4d678ca9228cbfc3c6d8c7af9af63e68c87a3"} Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.323943 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.324010 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"76595817-e040-4808-a8b5-e46e81712f9e","Type":"ContainerDied","Data":"2b9fd4dd7349adcf4e12149cfadb168736ad16e2297482fa7838a0652961c133"} Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.324064 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2b9fd4dd7349adcf4e12149cfadb168736ad16e2297482fa7838a0652961c133" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.346985 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" event={"ID":"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6","Type":"ContainerStarted","Data":"267d4b480e0783d9a19615762f91a7ce37de86a2ab70ea2a4b22a41bed004291"} Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.347571 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.359204 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrd6b" event={"ID":"04199621-c96a-4c6e-b7c0-3559112cc4fc","Type":"ContainerStarted","Data":"0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d"} Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.368963 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.390256 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" podStartSLOduration=8.39023982 podStartE2EDuration="8.39023982s" podCreationTimestamp="2026-03-20 10:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:32.370163146 +0000 UTC m=+258.461129631" watchObservedRunningTime="2026-03-20 10:59:32.39023982 +0000 UTC m=+258.481206305" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.444632 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-client-ca\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.444687 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-config\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.444754 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bqds2\" (UniqueName: \"kubernetes.io/projected/d0286807-ea4d-4045-a910-19af09dc6647-kube-api-access-bqds2\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: 
\"d0286807-ea4d-4045-a910-19af09dc6647\") " pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.444792 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0286807-ea4d-4045-a910-19af09dc6647-serving-cert\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.545755 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-client-ca\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.545820 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-config\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.545892 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bqds2\" (UniqueName: \"kubernetes.io/projected/d0286807-ea4d-4045-a910-19af09dc6647-kube-api-access-bqds2\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.545935 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0286807-ea4d-4045-a910-19af09dc6647-serving-cert\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.546644 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-client-ca\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.547243 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-config\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.550585 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0286807-ea4d-4045-a910-19af09dc6647-serving-cert\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " 
pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.561942 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bqds2\" (UniqueName: \"kubernetes.io/projected/d0286807-ea4d-4045-a910-19af09dc6647-kube-api-access-bqds2\") pod \"route-controller-manager-6d9b58b6fc-wj2zk\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.603319 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.610584 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.610620 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.650390 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eff70a36-e66f-4aac-b0c0-b628727f67e0" path="/var/lib/kubelet/pods/eff70a36-e66f-4aac-b0c0-b628727f67e0/volumes" Mar 20 10:59:32 crc kubenswrapper[4772]: I0320 10:59:32.889235 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk"] Mar 20 10:59:32 crc kubenswrapper[4772]: W0320 10:59:32.905050 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd0286807_ea4d_4045_a910_19af09dc6647.slice/crio-e8f78d48e81be337a78311acd994866bdfe90dc329d47732f47a9904cbc74096 WatchSource:0}: Error finding container e8f78d48e81be337a78311acd994866bdfe90dc329d47732f47a9904cbc74096: Status 404 returned error can't find the container with id e8f78d48e81be337a78311acd994866bdfe90dc329d47732f47a9904cbc74096 Mar 20 10:59:33 crc kubenswrapper[4772]: I0320 10:59:33.369326 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6l8kp" event={"ID":"3514d32d-88b3-47e4-b541-6ab2d46a6cfe","Type":"ContainerStarted","Data":"24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a"} Mar 20 10:59:33 crc kubenswrapper[4772]: I0320 10:59:33.371117 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c939fe35-51ef-40a4-951c-cebac7f55e8c","Type":"ContainerStarted","Data":"be2f87e1ea431564579692e2018584c5890503fe0a5261841cb20b3b711870b2"} Mar 20 10:59:33 crc kubenswrapper[4772]: I0320 10:59:33.372341 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" event={"ID":"d0286807-ea4d-4045-a910-19af09dc6647","Type":"ContainerStarted","Data":"e8f78d48e81be337a78311acd994866bdfe90dc329d47732f47a9904cbc74096"} Mar 20 10:59:33 crc kubenswrapper[4772]: I0320 10:59:33.413996 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-6l8kp" podStartSLOduration=4.306531401 podStartE2EDuration="42.413979099s" podCreationTimestamp="2026-03-20 10:58:51 +0000 UTC" firstStartedPulling="2026-03-20 10:58:53.717677776 +0000 UTC m=+219.808644261" lastFinishedPulling="2026-03-20 10:59:31.825125474 +0000 UTC 
m=+257.916091959" observedRunningTime="2026-03-20 10:59:33.392440705 +0000 UTC m=+259.483407190" watchObservedRunningTime="2026-03-20 10:59:33.413979099 +0000 UTC m=+259.504945584" Mar 20 10:59:33 crc kubenswrapper[4772]: I0320 10:59:33.416654 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=3.416647434 podStartE2EDuration="3.416647434s" podCreationTimestamp="2026-03-20 10:59:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:33.409791932 +0000 UTC m=+259.500758417" watchObservedRunningTime="2026-03-20 10:59:33.416647434 +0000 UTC m=+259.507613919" Mar 20 10:59:33 crc kubenswrapper[4772]: I0320 10:59:33.437313 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-hrd6b" podStartSLOduration=4.972800329 podStartE2EDuration="42.437296375s" podCreationTimestamp="2026-03-20 10:58:51 +0000 UTC" firstStartedPulling="2026-03-20 10:58:53.722142761 +0000 UTC m=+219.813109246" lastFinishedPulling="2026-03-20 10:59:31.186638807 +0000 UTC m=+257.277605292" observedRunningTime="2026-03-20 10:59:33.434259029 +0000 UTC m=+259.525225514" watchObservedRunningTime="2026-03-20 10:59:33.437296375 +0000 UTC m=+259.528262870" Mar 20 10:59:34 crc kubenswrapper[4772]: I0320 10:59:34.113378 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qdlzf" podUID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerName="registry-server" probeResult="failure" output=< Mar 20 10:59:34 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Mar 20 10:59:34 crc kubenswrapper[4772]: > Mar 20 10:59:34 crc kubenswrapper[4772]: I0320 10:59:34.378460 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" event={"ID":"d0286807-ea4d-4045-a910-19af09dc6647","Type":"ContainerStarted","Data":"1cbe9cc918ef2d32a4383d799a99201b1e74b494f1a8b42ccb3bb69dfc1cfe2e"} Mar 20 10:59:34 crc kubenswrapper[4772]: I0320 10:59:34.379259 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:34 crc kubenswrapper[4772]: I0320 10:59:34.385239 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:34 crc kubenswrapper[4772]: I0320 10:59:34.393679 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" podStartSLOduration=10.393654191 podStartE2EDuration="10.393654191s" podCreationTimestamp="2026-03-20 10:59:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:34.391157551 +0000 UTC m=+260.482124036" watchObservedRunningTime="2026-03-20 10:59:34.393654191 +0000 UTC m=+260.484620676" Mar 20 10:59:37 crc kubenswrapper[4772]: I0320 10:59:37.310990 4772 csr.go:261] certificate signing request csr-lt8c8 is approved, waiting to be issued Mar 20 10:59:37 crc kubenswrapper[4772]: I0320 10:59:37.317324 4772 csr.go:257] certificate signing request csr-lt8c8 is issued Mar 20 10:59:37 crc kubenswrapper[4772]: I0320 10:59:37.392689 4772 
generic.go:334] "Generic (PLEG): container finished" podID="9309f110-5a80-46ca-b3de-8087048c13e2" containerID="0b1b0f547474a86220ba01b4225bdb9e4b8d5cc0ed8f6fb441e918b9447359dc" exitCode=0 Mar 20 10:59:37 crc kubenswrapper[4772]: I0320 10:59:37.392736 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-zlqcq" event={"ID":"9309f110-5a80-46ca-b3de-8087048c13e2","Type":"ContainerDied","Data":"0b1b0f547474a86220ba01b4225bdb9e4b8d5cc0ed8f6fb441e918b9447359dc"} Mar 20 10:59:38 crc kubenswrapper[4772]: I0320 10:59:38.318722 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-02-24 05:54:36 +0000 UTC, rotation deadline is 2026-11-23 00:51:45.761871749 +0000 UTC Mar 20 10:59:38 crc kubenswrapper[4772]: I0320 10:59:38.318757 4772 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 5941h52m7.44311681s for next certificate rotation Mar 20 10:59:38 crc kubenswrapper[4772]: I0320 10:59:38.728972 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-zlqcq" Mar 20 10:59:38 crc kubenswrapper[4772]: I0320 10:59:38.846001 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d66cw\" (UniqueName: \"kubernetes.io/projected/9309f110-5a80-46ca-b3de-8087048c13e2-kube-api-access-d66cw\") pod \"9309f110-5a80-46ca-b3de-8087048c13e2\" (UID: \"9309f110-5a80-46ca-b3de-8087048c13e2\") " Mar 20 10:59:38 crc kubenswrapper[4772]: I0320 10:59:38.855778 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9309f110-5a80-46ca-b3de-8087048c13e2-kube-api-access-d66cw" (OuterVolumeSpecName: "kube-api-access-d66cw") pod "9309f110-5a80-46ca-b3de-8087048c13e2" (UID: "9309f110-5a80-46ca-b3de-8087048c13e2"). InnerVolumeSpecName "kube-api-access-d66cw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:38 crc kubenswrapper[4772]: I0320 10:59:38.948195 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d66cw\" (UniqueName: \"kubernetes.io/projected/9309f110-5a80-46ca-b3de-8087048c13e2-kube-api-access-d66cw\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:39 crc kubenswrapper[4772]: I0320 10:59:39.403580 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566738-zlqcq" event={"ID":"9309f110-5a80-46ca-b3de-8087048c13e2","Type":"ContainerDied","Data":"2d223b9fe83b8af1dcec02c6a9a74e9e64b68b3cffcd9f9698bc74fe2534c11d"} Mar 20 10:59:39 crc kubenswrapper[4772]: I0320 10:59:39.404253 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2d223b9fe83b8af1dcec02c6a9a74e9e64b68b3cffcd9f9698bc74fe2534c11d" Mar 20 10:59:39 crc kubenswrapper[4772]: I0320 10:59:39.404373 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566738-zlqcq" Mar 20 10:59:39 crc kubenswrapper[4772]: I0320 10:59:39.564963 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 10:59:39 crc kubenswrapper[4772]: I0320 10:59:39.565206 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 10:59:39 crc kubenswrapper[4772]: I0320 10:59:39.565302 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 10:59:39 crc kubenswrapper[4772]: I0320 10:59:39.565774 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 10:59:39 crc kubenswrapper[4772]: I0320 10:59:39.565919 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e" gracePeriod=600 Mar 20 10:59:40 crc kubenswrapper[4772]: I0320 10:59:40.409092 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e" exitCode=0 Mar 20 10:59:40 crc kubenswrapper[4772]: I0320 10:59:40.409265 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e"} Mar 20 10:59:41 crc kubenswrapper[4772]: I0320 10:59:41.428296 4772 generic.go:334] "Generic (PLEG): container finished" podID="4eadb8ff-b747-4293-800f-b9894eb72ee3" containerID="c4e7959eb48ed220c995ade9c7ddd7984f1b7339769c2bfa03830bd543ae348b" exitCode=0 Mar 20 10:59:41 crc kubenswrapper[4772]: I0320 10:59:41.428333 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr4qj" event={"ID":"4eadb8ff-b747-4293-800f-b9894eb72ee3","Type":"ContainerDied","Data":"c4e7959eb48ed220c995ade9c7ddd7984f1b7339769c2bfa03830bd543ae348b"} Mar 20 10:59:41 crc kubenswrapper[4772]: I0320 10:59:41.435073 4772 generic.go:334] "Generic (PLEG): container finished" podID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" containerID="18852cff0b41a24899b69dacbe1f263b3676ce4ba6ee2f1d4d3b6291bb9f1dbe" exitCode=0 Mar 20 10:59:41 crc kubenswrapper[4772]: I0320 10:59:41.435155 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mp8g" 
event={"ID":"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe","Type":"ContainerDied","Data":"18852cff0b41a24899b69dacbe1f263b3676ce4ba6ee2f1d4d3b6291bb9f1dbe"} Mar 20 10:59:41 crc kubenswrapper[4772]: I0320 10:59:41.438164 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"c7d6b41b9d4ea0c87e4e55d0473f2ee78f694c9c978233327bfd7e8f2cecafdc"} Mar 20 10:59:41 crc kubenswrapper[4772]: I0320 10:59:41.598372 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:59:41 crc kubenswrapper[4772]: I0320 10:59:41.599019 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:59:41 crc kubenswrapper[4772]: I0320 10:59:41.645447 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:59:42 crc kubenswrapper[4772]: I0320 10:59:42.214866 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:59:42 crc kubenswrapper[4772]: I0320 10:59:42.215263 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:59:42 crc kubenswrapper[4772]: I0320 10:59:42.263721 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:59:42 crc kubenswrapper[4772]: I0320 10:59:42.446463 4772 generic.go:334] "Generic (PLEG): container finished" podID="22e23182-8e10-42d5-b34d-f09f6f280262" containerID="237b36c26cfb1ce5836c98bcca1d49beb6567974e68cc283998a7555792ac1a5" exitCode=0 Mar 20 10:59:42 crc kubenswrapper[4772]: I0320 10:59:42.446531 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kc55" event={"ID":"22e23182-8e10-42d5-b34d-f09f6f280262","Type":"ContainerDied","Data":"237b36c26cfb1ce5836c98bcca1d49beb6567974e68cc283998a7555792ac1a5"} Mar 20 10:59:42 crc kubenswrapper[4772]: I0320 10:59:42.449039 4772 generic.go:334] "Generic (PLEG): container finished" podID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" containerID="410c78cf63d885abae3c4000b16235caf580a2960bcbfe6a8e8ab52675e345de" exitCode=0 Mar 20 10:59:42 crc kubenswrapper[4772]: I0320 10:59:42.449164 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt8p5" event={"ID":"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260","Type":"ContainerDied","Data":"410c78cf63d885abae3c4000b16235caf580a2960bcbfe6a8e8ab52675e345de"} Mar 20 10:59:42 crc kubenswrapper[4772]: I0320 10:59:42.499675 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:59:42 crc kubenswrapper[4772]: I0320 10:59:42.508727 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 10:59:42 crc kubenswrapper[4772]: I0320 10:59:42.657978 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:59:42 crc kubenswrapper[4772]: I0320 10:59:42.705110 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:59:43 crc 
kubenswrapper[4772]: I0320 10:59:43.457557 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kc55" event={"ID":"22e23182-8e10-42d5-b34d-f09f6f280262","Type":"ContainerStarted","Data":"8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0"} Mar 20 10:59:43 crc kubenswrapper[4772]: I0320 10:59:43.460004 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt8p5" event={"ID":"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260","Type":"ContainerStarted","Data":"6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb"} Mar 20 10:59:43 crc kubenswrapper[4772]: I0320 10:59:43.462238 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr4qj" event={"ID":"4eadb8ff-b747-4293-800f-b9894eb72ee3","Type":"ContainerStarted","Data":"1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713"} Mar 20 10:59:43 crc kubenswrapper[4772]: I0320 10:59:43.464062 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9wnx" event={"ID":"b1148455-d28e-4866-8b3e-cbabeaad84c7","Type":"ContainerStarted","Data":"2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661"} Mar 20 10:59:43 crc kubenswrapper[4772]: I0320 10:59:43.466169 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mp8g" event={"ID":"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe","Type":"ContainerStarted","Data":"678dabe97912403cd82a3bdc8d03440e424463cba065b6f560cf54eee5c29001"} Mar 20 10:59:43 crc kubenswrapper[4772]: I0320 10:59:43.477030 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-8kc55" podStartSLOduration=2.869527464 podStartE2EDuration="55.477012889s" podCreationTimestamp="2026-03-20 10:58:48 +0000 UTC" firstStartedPulling="2026-03-20 10:58:50.512405022 +0000 UTC m=+216.603371507" lastFinishedPulling="2026-03-20 10:59:43.119890447 +0000 UTC m=+269.210856932" observedRunningTime="2026-03-20 10:59:43.473865351 +0000 UTC m=+269.564831856" watchObservedRunningTime="2026-03-20 10:59:43.477012889 +0000 UTC m=+269.567979384" Mar 20 10:59:43 crc kubenswrapper[4772]: I0320 10:59:43.496935 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7mp8g" podStartSLOduration=3.420132463 podStartE2EDuration="54.496912738s" podCreationTimestamp="2026-03-20 10:58:49 +0000 UTC" firstStartedPulling="2026-03-20 10:58:51.586478054 +0000 UTC m=+217.677444539" lastFinishedPulling="2026-03-20 10:59:42.663258329 +0000 UTC m=+268.754224814" observedRunningTime="2026-03-20 10:59:43.493196994 +0000 UTC m=+269.584163479" watchObservedRunningTime="2026-03-20 10:59:43.496912738 +0000 UTC m=+269.587879223" Mar 20 10:59:43 crc kubenswrapper[4772]: I0320 10:59:43.514331 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-pt8p5" podStartSLOduration=3.093805614 podStartE2EDuration="55.514310187s" podCreationTimestamp="2026-03-20 10:58:48 +0000 UTC" firstStartedPulling="2026-03-20 10:58:50.526278781 +0000 UTC m=+216.617245266" lastFinishedPulling="2026-03-20 10:59:42.946783354 +0000 UTC m=+269.037749839" observedRunningTime="2026-03-20 10:59:43.510795208 +0000 UTC m=+269.601761703" watchObservedRunningTime="2026-03-20 10:59:43.514310187 +0000 UTC m=+269.605276672" Mar 20 10:59:43 crc kubenswrapper[4772]: I0320 
10:59:43.541916 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-pr4qj" podStartSLOduration=3.398133779 podStartE2EDuration="53.541895743s" podCreationTimestamp="2026-03-20 10:58:50 +0000 UTC" firstStartedPulling="2026-03-20 10:58:52.619365131 +0000 UTC m=+218.710331616" lastFinishedPulling="2026-03-20 10:59:42.763127095 +0000 UTC m=+268.854093580" observedRunningTime="2026-03-20 10:59:43.540906874 +0000 UTC m=+269.631873359" watchObservedRunningTime="2026-03-20 10:59:43.541895743 +0000 UTC m=+269.632862218" Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.315651 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6df7479c96-vblnv"] Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.315859 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" podUID="6ec3cc4e-393d-4aff-b59b-d47c1bb847c6" containerName="controller-manager" containerID="cri-o://267d4b480e0783d9a19615762f91a7ce37de86a2ab70ea2a4b22a41bed004291" gracePeriod=30 Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.327820 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk"] Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.328045 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" podUID="d0286807-ea4d-4045-a910-19af09dc6647" containerName="route-controller-manager" containerID="cri-o://1cbe9cc918ef2d32a4383d799a99201b1e74b494f1a8b42ccb3bb69dfc1cfe2e" gracePeriod=30 Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.475440 4772 generic.go:334] "Generic (PLEG): container finished" podID="d0286807-ea4d-4045-a910-19af09dc6647" containerID="1cbe9cc918ef2d32a4383d799a99201b1e74b494f1a8b42ccb3bb69dfc1cfe2e" exitCode=0 Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.475500 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" event={"ID":"d0286807-ea4d-4045-a910-19af09dc6647","Type":"ContainerDied","Data":"1cbe9cc918ef2d32a4383d799a99201b1e74b494f1a8b42ccb3bb69dfc1cfe2e"} Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.477226 4772 generic.go:334] "Generic (PLEG): container finished" podID="6ec3cc4e-393d-4aff-b59b-d47c1bb847c6" containerID="267d4b480e0783d9a19615762f91a7ce37de86a2ab70ea2a4b22a41bed004291" exitCode=0 Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.477261 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" event={"ID":"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6","Type":"ContainerDied","Data":"267d4b480e0783d9a19615762f91a7ce37de86a2ab70ea2a4b22a41bed004291"} Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.480089 4772 generic.go:334] "Generic (PLEG): container finished" podID="b1148455-d28e-4866-8b3e-cbabeaad84c7" containerID="2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661" exitCode=0 Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.480114 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9wnx" event={"ID":"b1148455-d28e-4866-8b3e-cbabeaad84c7","Type":"ContainerDied","Data":"2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661"} 
Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.834179 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.914223 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.938406 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bqds2\" (UniqueName: \"kubernetes.io/projected/d0286807-ea4d-4045-a910-19af09dc6647-kube-api-access-bqds2\") pod \"d0286807-ea4d-4045-a910-19af09dc6647\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.938589 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-config\") pod \"d0286807-ea4d-4045-a910-19af09dc6647\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.939566 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-config" (OuterVolumeSpecName: "config") pod "d0286807-ea4d-4045-a910-19af09dc6647" (UID: "d0286807-ea4d-4045-a910-19af09dc6647"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.939798 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0286807-ea4d-4045-a910-19af09dc6647-serving-cert\") pod \"d0286807-ea4d-4045-a910-19af09dc6647\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.939834 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-client-ca\") pod \"d0286807-ea4d-4045-a910-19af09dc6647\" (UID: \"d0286807-ea4d-4045-a910-19af09dc6647\") " Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.940144 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.940458 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-client-ca" (OuterVolumeSpecName: "client-ca") pod "d0286807-ea4d-4045-a910-19af09dc6647" (UID: "d0286807-ea4d-4045-a910-19af09dc6647"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.945729 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d0286807-ea4d-4045-a910-19af09dc6647-kube-api-access-bqds2" (OuterVolumeSpecName: "kube-api-access-bqds2") pod "d0286807-ea4d-4045-a910-19af09dc6647" (UID: "d0286807-ea4d-4045-a910-19af09dc6647"). InnerVolumeSpecName "kube-api-access-bqds2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:44 crc kubenswrapper[4772]: I0320 10:59:44.952770 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d0286807-ea4d-4045-a910-19af09dc6647-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "d0286807-ea4d-4045-a910-19af09dc6647" (UID: "d0286807-ea4d-4045-a910-19af09dc6647"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.041548 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-config\") pod \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.041624 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-proxy-ca-bundles\") pod \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.041663 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-client-ca\") pod \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.041679 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-serving-cert\") pod \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.041714 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jcjqh\" (UniqueName: \"kubernetes.io/projected/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-kube-api-access-jcjqh\") pod \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\" (UID: \"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6\") " Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.041861 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d0286807-ea4d-4045-a910-19af09dc6647-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.041871 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/d0286807-ea4d-4045-a910-19af09dc6647-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.041880 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bqds2\" (UniqueName: \"kubernetes.io/projected/d0286807-ea4d-4045-a910-19af09dc6647-kube-api-access-bqds2\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.042714 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-client-ca" (OuterVolumeSpecName: "client-ca") pod "6ec3cc4e-393d-4aff-b59b-d47c1bb847c6" (UID: "6ec3cc4e-393d-4aff-b59b-d47c1bb847c6"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.042740 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-config" (OuterVolumeSpecName: "config") pod "6ec3cc4e-393d-4aff-b59b-d47c1bb847c6" (UID: "6ec3cc4e-393d-4aff-b59b-d47c1bb847c6"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.042985 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6ec3cc4e-393d-4aff-b59b-d47c1bb847c6" (UID: "6ec3cc4e-393d-4aff-b59b-d47c1bb847c6"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.044571 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6ec3cc4e-393d-4aff-b59b-d47c1bb847c6" (UID: "6ec3cc4e-393d-4aff-b59b-d47c1bb847c6"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.048043 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-kube-api-access-jcjqh" (OuterVolumeSpecName: "kube-api-access-jcjqh") pod "6ec3cc4e-393d-4aff-b59b-d47c1bb847c6" (UID: "6ec3cc4e-393d-4aff-b59b-d47c1bb847c6"). InnerVolumeSpecName "kube-api-access-jcjqh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.143579 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.143809 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.143825 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jcjqh\" (UniqueName: \"kubernetes.io/projected/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-kube-api-access-jcjqh\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.143856 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-config\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.143871 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.488319 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" event={"ID":"6ec3cc4e-393d-4aff-b59b-d47c1bb847c6","Type":"ContainerDied","Data":"ef0e5e5df7268c655c6d2f52869f5d1cc092ce9c081c5a43ef5c82594b40ad24"} Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.488527 4772 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6df7479c96-vblnv" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.489337 4772 scope.go:117] "RemoveContainer" containerID="267d4b480e0783d9a19615762f91a7ce37de86a2ab70ea2a4b22a41bed004291" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.492573 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9wnx" event={"ID":"b1148455-d28e-4866-8b3e-cbabeaad84c7","Type":"ContainerStarted","Data":"fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7"} Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.493928 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" event={"ID":"d0286807-ea4d-4045-a910-19af09dc6647","Type":"ContainerDied","Data":"e8f78d48e81be337a78311acd994866bdfe90dc329d47732f47a9904cbc74096"} Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.493952 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.517146 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-f9wnx" podStartSLOduration=2.070436307 podStartE2EDuration="56.517126262s" podCreationTimestamp="2026-03-20 10:58:49 +0000 UTC" firstStartedPulling="2026-03-20 10:58:50.493576162 +0000 UTC m=+216.584542647" lastFinishedPulling="2026-03-20 10:59:44.940266127 +0000 UTC m=+271.031232602" observedRunningTime="2026-03-20 10:59:45.511410552 +0000 UTC m=+271.602377047" watchObservedRunningTime="2026-03-20 10:59:45.517126262 +0000 UTC m=+271.608092757" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.525420 4772 scope.go:117] "RemoveContainer" containerID="1cbe9cc918ef2d32a4383d799a99201b1e74b494f1a8b42ccb3bb69dfc1cfe2e" Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.533908 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk"] Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.536709 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6d9b58b6fc-wj2zk"] Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.548917 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6df7479c96-vblnv"] Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.553498 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6df7479c96-vblnv"] Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.869801 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrd6b"] Mar 20 10:59:45 crc kubenswrapper[4772]: I0320 10:59:45.870113 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-hrd6b" podUID="04199621-c96a-4c6e-b7c0-3559112cc4fc" containerName="registry-server" containerID="cri-o://0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d" gracePeriod=2 Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.072402 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qdlzf"] Mar 20 10:59:46 
crc kubenswrapper[4772]: I0320 10:59:46.072750 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qdlzf" podUID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerName="registry-server" containerID="cri-o://1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39" gracePeriod=2 Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.284658 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.293355 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-bfc585969-cppp2"] Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.293599 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ec3cc4e-393d-4aff-b59b-d47c1bb847c6" containerName="controller-manager" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.293614 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ec3cc4e-393d-4aff-b59b-d47c1bb847c6" containerName="controller-manager" Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.293625 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04199621-c96a-4c6e-b7c0-3559112cc4fc" containerName="extract-content" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.293631 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="04199621-c96a-4c6e-b7c0-3559112cc4fc" containerName="extract-content" Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.293641 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9309f110-5a80-46ca-b3de-8087048c13e2" containerName="oc" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.293647 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="9309f110-5a80-46ca-b3de-8087048c13e2" containerName="oc" Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.293659 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04199621-c96a-4c6e-b7c0-3559112cc4fc" containerName="registry-server" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.293666 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="04199621-c96a-4c6e-b7c0-3559112cc4fc" containerName="registry-server" Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.293679 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="04199621-c96a-4c6e-b7c0-3559112cc4fc" containerName="extract-utilities" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.293685 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="04199621-c96a-4c6e-b7c0-3559112cc4fc" containerName="extract-utilities" Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.293695 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d0286807-ea4d-4045-a910-19af09dc6647" containerName="route-controller-manager" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.293701 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d0286807-ea4d-4045-a910-19af09dc6647" containerName="route-controller-manager" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.293785 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ec3cc4e-393d-4aff-b59b-d47c1bb847c6" containerName="controller-manager" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.293795 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="04199621-c96a-4c6e-b7c0-3559112cc4fc" containerName="registry-server" Mar 20 10:59:46 crc 
kubenswrapper[4772]: I0320 10:59:46.293803 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="9309f110-5a80-46ca-b3de-8087048c13e2" containerName="oc" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.293813 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d0286807-ea4d-4045-a910-19af09dc6647" containerName="route-controller-manager" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.294199 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.303062 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.304522 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.305222 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.306316 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.308549 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.308794 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.316640 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6"] Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.318113 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.332380 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.332591 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.332729 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.333189 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.335974 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.336396 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.336539 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.336649 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6"] Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.341674 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bfc585969-cppp2"] Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.361444 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-client-ca\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.381109 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6073a773-4b7a-4edf-a1d2-f559c26abc9c-serving-cert\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.381425 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-config\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.381576 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-client-ca\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " 
pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.381711 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/824f2af7-476f-4b4e-96c5-4dcdcd159130-serving-cert\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.381965 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s6mwj\" (UniqueName: \"kubernetes.io/projected/824f2af7-476f-4b4e-96c5-4dcdcd159130-kube-api-access-s6mwj\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.382118 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-config\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.382291 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-proxy-ca-bundles\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.382448 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lwpkk\" (UniqueName: \"kubernetes.io/projected/6073a773-4b7a-4edf-a1d2-f559c26abc9c-kube-api-access-lwpkk\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.448799 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.483676 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-utilities\") pod \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.483740 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czr5h\" (UniqueName: \"kubernetes.io/projected/04199621-c96a-4c6e-b7c0-3559112cc4fc-kube-api-access-czr5h\") pod \"04199621-c96a-4c6e-b7c0-3559112cc4fc\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.483766 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-utilities\") pod \"04199621-c96a-4c6e-b7c0-3559112cc4fc\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.483810 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-catalog-content\") pod \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.483853 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-catalog-content\") pod \"04199621-c96a-4c6e-b7c0-3559112cc4fc\" (UID: \"04199621-c96a-4c6e-b7c0-3559112cc4fc\") " Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.483885 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpgpg\" (UniqueName: \"kubernetes.io/projected/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-kube-api-access-hpgpg\") pod \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\" (UID: \"38661b1d-4edd-438e-b69b-6e9f9c8a7d65\") " Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.484037 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-proxy-ca-bundles\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.484096 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lwpkk\" (UniqueName: \"kubernetes.io/projected/6073a773-4b7a-4edf-a1d2-f559c26abc9c-kube-api-access-lwpkk\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.484145 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-client-ca\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 
10:59:46.484190 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6073a773-4b7a-4edf-a1d2-f559c26abc9c-serving-cert\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.484217 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-config\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.484241 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-client-ca\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.484264 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/824f2af7-476f-4b4e-96c5-4dcdcd159130-serving-cert\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.484286 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s6mwj\" (UniqueName: \"kubernetes.io/projected/824f2af7-476f-4b4e-96c5-4dcdcd159130-kube-api-access-s6mwj\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.484314 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-config\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.484914 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-utilities" (OuterVolumeSpecName: "utilities") pod "38661b1d-4edd-438e-b69b-6e9f9c8a7d65" (UID: "38661b1d-4edd-438e-b69b-6e9f9c8a7d65"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.484962 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-utilities" (OuterVolumeSpecName: "utilities") pod "04199621-c96a-4c6e-b7c0-3559112cc4fc" (UID: "04199621-c96a-4c6e-b7c0-3559112cc4fc"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.485686 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-config\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.485980 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-client-ca\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.486416 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-client-ca\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.487670 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-proxy-ca-bundles\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.489997 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/04199621-c96a-4c6e-b7c0-3559112cc4fc-kube-api-access-czr5h" (OuterVolumeSpecName: "kube-api-access-czr5h") pod "04199621-c96a-4c6e-b7c0-3559112cc4fc" (UID: "04199621-c96a-4c6e-b7c0-3559112cc4fc"). InnerVolumeSpecName "kube-api-access-czr5h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.491411 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-kube-api-access-hpgpg" (OuterVolumeSpecName: "kube-api-access-hpgpg") pod "38661b1d-4edd-438e-b69b-6e9f9c8a7d65" (UID: "38661b1d-4edd-438e-b69b-6e9f9c8a7d65"). InnerVolumeSpecName "kube-api-access-hpgpg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.494461 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/824f2af7-476f-4b4e-96c5-4dcdcd159130-serving-cert\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.506884 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-config\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.516302 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6073a773-4b7a-4edf-a1d2-f559c26abc9c-serving-cert\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.523779 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s6mwj\" (UniqueName: \"kubernetes.io/projected/824f2af7-476f-4b4e-96c5-4dcdcd159130-kube-api-access-s6mwj\") pod \"controller-manager-bfc585969-cppp2\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.524696 4772 generic.go:334] "Generic (PLEG): container finished" podID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerID="1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39" exitCode=0 Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.524761 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qdlzf" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.524762 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdlzf" event={"ID":"38661b1d-4edd-438e-b69b-6e9f9c8a7d65","Type":"ContainerDied","Data":"1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39"} Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.524990 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qdlzf" event={"ID":"38661b1d-4edd-438e-b69b-6e9f9c8a7d65","Type":"ContainerDied","Data":"002c321de43dff0295934eb0ec3482f04c29b4e9c635a8567d0e5bac56a8d3fa"} Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.525011 4772 scope.go:117] "RemoveContainer" containerID="1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.528122 4772 generic.go:334] "Generic (PLEG): container finished" podID="04199621-c96a-4c6e-b7c0-3559112cc4fc" containerID="0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d" exitCode=0 Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.528183 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrd6b" event={"ID":"04199621-c96a-4c6e-b7c0-3559112cc4fc","Type":"ContainerDied","Data":"0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d"} Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.528211 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-hrd6b" event={"ID":"04199621-c96a-4c6e-b7c0-3559112cc4fc","Type":"ContainerDied","Data":"b2f4a285ef424ac7f8e0a116f7a0546f4d58d36738d88d8c4e851e79b240b085"} Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.528232 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-hrd6b" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.528505 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lwpkk\" (UniqueName: \"kubernetes.io/projected/6073a773-4b7a-4edf-a1d2-f559c26abc9c-kube-api-access-lwpkk\") pod \"route-controller-manager-7dd98b5c79-wb9t6\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.542549 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "04199621-c96a-4c6e-b7c0-3559112cc4fc" (UID: "04199621-c96a-4c6e-b7c0-3559112cc4fc"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.557121 4772 scope.go:117] "RemoveContainer" containerID="5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.580035 4772 scope.go:117] "RemoveContainer" containerID="fceedb26eba475870e6510019a8b0e1080b68c2452dcee44d73fac8ebf97373e" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.585166 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.585192 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpgpg\" (UniqueName: \"kubernetes.io/projected/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-kube-api-access-hpgpg\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.585202 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.585211 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czr5h\" (UniqueName: \"kubernetes.io/projected/04199621-c96a-4c6e-b7c0-3559112cc4fc-kube-api-access-czr5h\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.585221 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/04199621-c96a-4c6e-b7c0-3559112cc4fc-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.590921 4772 scope.go:117] "RemoveContainer" containerID="1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39" Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.591816 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39\": container with ID starting with 1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39 not found: ID does not exist" containerID="1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.591917 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39"} err="failed to get container status \"1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39\": rpc error: code = NotFound desc = could not find container \"1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39\": container with ID starting with 1843eef7f3e4e55670e7f7d2af47d18a3b6df845acfd7147a6e5a786ab050b39 not found: ID does not exist" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.591945 4772 scope.go:117] "RemoveContainer" containerID="5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc" Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.592349 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc\": container with ID starting with 5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc not found: ID does 
not exist" containerID="5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.592385 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc"} err="failed to get container status \"5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc\": rpc error: code = NotFound desc = could not find container \"5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc\": container with ID starting with 5670219cf6aad8b01b2b58d0f1dd92f85380b679394e2827fc83d666d19b08dc not found: ID does not exist" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.592411 4772 scope.go:117] "RemoveContainer" containerID="fceedb26eba475870e6510019a8b0e1080b68c2452dcee44d73fac8ebf97373e" Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.592642 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fceedb26eba475870e6510019a8b0e1080b68c2452dcee44d73fac8ebf97373e\": container with ID starting with fceedb26eba475870e6510019a8b0e1080b68c2452dcee44d73fac8ebf97373e not found: ID does not exist" containerID="fceedb26eba475870e6510019a8b0e1080b68c2452dcee44d73fac8ebf97373e" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.592664 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fceedb26eba475870e6510019a8b0e1080b68c2452dcee44d73fac8ebf97373e"} err="failed to get container status \"fceedb26eba475870e6510019a8b0e1080b68c2452dcee44d73fac8ebf97373e\": rpc error: code = NotFound desc = could not find container \"fceedb26eba475870e6510019a8b0e1080b68c2452dcee44d73fac8ebf97373e\": container with ID starting with fceedb26eba475870e6510019a8b0e1080b68c2452dcee44d73fac8ebf97373e not found: ID does not exist" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.592677 4772 scope.go:117] "RemoveContainer" containerID="0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.615287 4772 scope.go:117] "RemoveContainer" containerID="8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.636299 4772 scope.go:117] "RemoveContainer" containerID="c98cb487fd5e4cc70a2374c5b1d57cc12ceea4c1e0321745a83f41a0af7e3386" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.640026 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.646071 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.649357 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ec3cc4e-393d-4aff-b59b-d47c1bb847c6" path="/var/lib/kubelet/pods/6ec3cc4e-393d-4aff-b59b-d47c1bb847c6/volumes" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.650458 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d0286807-ea4d-4045-a910-19af09dc6647" path="/var/lib/kubelet/pods/d0286807-ea4d-4045-a910-19af09dc6647/volumes" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.660880 4772 scope.go:117] "RemoveContainer" containerID="0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d" Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.661229 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d\": container with ID starting with 0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d not found: ID does not exist" containerID="0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.661262 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d"} err="failed to get container status \"0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d\": rpc error: code = NotFound desc = could not find container \"0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d\": container with ID starting with 0f1c2a6e4be6db407c2ea7ad5fde9e4a2d2d1565e5128bbcbff4da0db111715d not found: ID does not exist" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.661292 4772 scope.go:117] "RemoveContainer" containerID="8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de" Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.662881 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de\": container with ID starting with 8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de not found: ID does not exist" containerID="8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.662909 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de"} err="failed to get container status \"8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de\": rpc error: code = NotFound desc = could not find container \"8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de\": container with ID starting with 8fc210c65631b650eb0151ccb102b38aa431c801eab1fc0efdeb441addf933de not found: ID does not exist" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.662925 4772 scope.go:117] "RemoveContainer" containerID="c98cb487fd5e4cc70a2374c5b1d57cc12ceea4c1e0321745a83f41a0af7e3386" Mar 20 10:59:46 crc kubenswrapper[4772]: E0320 10:59:46.663270 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c98cb487fd5e4cc70a2374c5b1d57cc12ceea4c1e0321745a83f41a0af7e3386\": container with ID starting with 
c98cb487fd5e4cc70a2374c5b1d57cc12ceea4c1e0321745a83f41a0af7e3386 not found: ID does not exist" containerID="c98cb487fd5e4cc70a2374c5b1d57cc12ceea4c1e0321745a83f41a0af7e3386" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.663298 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c98cb487fd5e4cc70a2374c5b1d57cc12ceea4c1e0321745a83f41a0af7e3386"} err="failed to get container status \"c98cb487fd5e4cc70a2374c5b1d57cc12ceea4c1e0321745a83f41a0af7e3386\": rpc error: code = NotFound desc = could not find container \"c98cb487fd5e4cc70a2374c5b1d57cc12ceea4c1e0321745a83f41a0af7e3386\": container with ID starting with c98cb487fd5e4cc70a2374c5b1d57cc12ceea4c1e0321745a83f41a0af7e3386 not found: ID does not exist" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.669815 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "38661b1d-4edd-438e-b69b-6e9f9c8a7d65" (UID: "38661b1d-4edd-438e-b69b-6e9f9c8a7d65"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.686119 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/38661b1d-4edd-438e-b69b-6e9f9c8a7d65-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.855913 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrd6b"] Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.872895 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-hrd6b"] Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.873899 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qdlzf"] Mar 20 10:59:46 crc kubenswrapper[4772]: I0320 10:59:46.875744 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qdlzf"] Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.113228 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6"] Mar 20 10:59:47 crc kubenswrapper[4772]: W0320 10:59:47.126504 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6073a773_4b7a_4edf_a1d2_f559c26abc9c.slice/crio-78a9193284a5760eda3d6f05c14ac605ee64ecd07f454cb6cc30dc68f61195e8 WatchSource:0}: Error finding container 78a9193284a5760eda3d6f05c14ac605ee64ecd07f454cb6cc30dc68f61195e8: Status 404 returned error can't find the container with id 78a9193284a5760eda3d6f05c14ac605ee64ecd07f454cb6cc30dc68f61195e8 Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.159692 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-bfc585969-cppp2"] Mar 20 10:59:47 crc kubenswrapper[4772]: W0320 10:59:47.164204 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod824f2af7_476f_4b4e_96c5_4dcdcd159130.slice/crio-e50a0144842e1949fabed49bcdb8571281469571ebe7b756baf686a87b7e6e7e WatchSource:0}: Error finding container e50a0144842e1949fabed49bcdb8571281469571ebe7b756baf686a87b7e6e7e: Status 404 returned error can't find the 
container with id e50a0144842e1949fabed49bcdb8571281469571ebe7b756baf686a87b7e6e7e Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.545527 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" event={"ID":"824f2af7-476f-4b4e-96c5-4dcdcd159130","Type":"ContainerStarted","Data":"75f29844894ee7c0e5e286ce1c2b84b0e96eb085f06f4e1c5349fb67cab00501"} Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.545570 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" event={"ID":"824f2af7-476f-4b4e-96c5-4dcdcd159130","Type":"ContainerStarted","Data":"e50a0144842e1949fabed49bcdb8571281469571ebe7b756baf686a87b7e6e7e"} Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.545931 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.562486 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.563146 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" event={"ID":"6073a773-4b7a-4edf-a1d2-f559c26abc9c","Type":"ContainerStarted","Data":"ece4f402a2093eb0df360f9f13102499075a4224a8599f5bcde2bb034a793ea8"} Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.563205 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" event={"ID":"6073a773-4b7a-4edf-a1d2-f559c26abc9c","Type":"ContainerStarted","Data":"78a9193284a5760eda3d6f05c14ac605ee64ecd07f454cb6cc30dc68f61195e8"} Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.563598 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.578608 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" podStartSLOduration=3.578588634 podStartE2EDuration="3.578588634s" podCreationTimestamp="2026-03-20 10:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:47.574439928 +0000 UTC m=+273.665406423" watchObservedRunningTime="2026-03-20 10:59:47.578588634 +0000 UTC m=+273.669555129" Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.597122 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" podStartSLOduration=3.597099714 podStartE2EDuration="3.597099714s" podCreationTimestamp="2026-03-20 10:59:44 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 10:59:47.593946896 +0000 UTC m=+273.684913401" watchObservedRunningTime="2026-03-20 10:59:47.597099714 +0000 UTC m=+273.688066199" Mar 20 10:59:47 crc kubenswrapper[4772]: I0320 10:59:47.979064 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 10:59:48 crc kubenswrapper[4772]: 
I0320 10:59:48.648876 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="04199621-c96a-4c6e-b7c0-3559112cc4fc" path="/var/lib/kubelet/pods/04199621-c96a-4c6e-b7c0-3559112cc4fc/volumes" Mar 20 10:59:48 crc kubenswrapper[4772]: I0320 10:59:48.650274 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" path="/var/lib/kubelet/pods/38661b1d-4edd-438e-b69b-6e9f9c8a7d65/volumes" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.095699 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.095805 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.152670 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.190557 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.190620 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.255058 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.412099 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.412144 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.464683 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.633007 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.642190 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-8kc55" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.677184 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.677245 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:59:49 crc kubenswrapper[4772]: I0320 10:59:49.734552 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:59:50 crc kubenswrapper[4772]: I0320 10:59:50.637436 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:59:51 crc kubenswrapper[4772]: I0320 10:59:51.179465 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:59:51 crc kubenswrapper[4772]: I0320 
10:59:51.180389 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:59:51 crc kubenswrapper[4772]: I0320 10:59:51.247714 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:59:51 crc kubenswrapper[4772]: I0320 10:59:51.648717 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 10:59:52 crc kubenswrapper[4772]: I0320 10:59:52.477732 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7mp8g"] Mar 20 10:59:52 crc kubenswrapper[4772]: I0320 10:59:52.597754 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-7mp8g" podUID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" containerName="registry-server" containerID="cri-o://678dabe97912403cd82a3bdc8d03440e424463cba065b6f560cf54eee5c29001" gracePeriod=2 Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.606928 4772 generic.go:334] "Generic (PLEG): container finished" podID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" containerID="678dabe97912403cd82a3bdc8d03440e424463cba065b6f560cf54eee5c29001" exitCode=0 Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.607824 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mp8g" event={"ID":"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe","Type":"ContainerDied","Data":"678dabe97912403cd82a3bdc8d03440e424463cba065b6f560cf54eee5c29001"} Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.801275 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.893631 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wnxsh\" (UniqueName: \"kubernetes.io/projected/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-kube-api-access-wnxsh\") pod \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.893711 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-utilities\") pod \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.893763 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-catalog-content\") pod \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\" (UID: \"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe\") " Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.894590 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-utilities" (OuterVolumeSpecName: "utilities") pod "3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" (UID: "3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.901050 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-kube-api-access-wnxsh" (OuterVolumeSpecName: "kube-api-access-wnxsh") pod "3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" (UID: "3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe"). InnerVolumeSpecName "kube-api-access-wnxsh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.955636 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" (UID: "3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.994603 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.994639 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wnxsh\" (UniqueName: \"kubernetes.io/projected/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-kube-api-access-wnxsh\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:53 crc kubenswrapper[4772]: I0320 10:59:53.994653 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 10:59:54 crc kubenswrapper[4772]: I0320 10:59:54.614741 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7mp8g" event={"ID":"3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe","Type":"ContainerDied","Data":"fec2d7bddc8a1f16f63a769dc043d245a99c5c5253dd8211d4e78f396fc50b66"} Mar 20 10:59:54 crc kubenswrapper[4772]: I0320 10:59:54.614826 4772 scope.go:117] "RemoveContainer" containerID="678dabe97912403cd82a3bdc8d03440e424463cba065b6f560cf54eee5c29001" Mar 20 10:59:54 crc kubenswrapper[4772]: I0320 10:59:54.615312 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-7mp8g" Mar 20 10:59:54 crc kubenswrapper[4772]: I0320 10:59:54.641032 4772 scope.go:117] "RemoveContainer" containerID="18852cff0b41a24899b69dacbe1f263b3676ce4ba6ee2f1d4d3b6291bb9f1dbe" Mar 20 10:59:54 crc kubenswrapper[4772]: I0320 10:59:54.661084 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-7mp8g"] Mar 20 10:59:54 crc kubenswrapper[4772]: I0320 10:59:54.664639 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-7mp8g"] Mar 20 10:59:54 crc kubenswrapper[4772]: I0320 10:59:54.675002 4772 scope.go:117] "RemoveContainer" containerID="4d3c5a2cb8492051edd44fb39984d3864e657aaf532571d686710e2c858a81d1" Mar 20 10:59:56 crc kubenswrapper[4772]: I0320 10:59:56.649540 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" path="/var/lib/kubelet/pods/3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe/volumes" Mar 20 10:59:59 crc kubenswrapper[4772]: I0320 10:59:59.482099 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 10:59:59 crc kubenswrapper[4772]: I0320 10:59:59.535348 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f9wnx"] Mar 20 10:59:59 crc kubenswrapper[4772]: I0320 10:59:59.645867 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-f9wnx" podUID="b1148455-d28e-4866-8b3e-cbabeaad84c7" containerName="registry-server" containerID="cri-o://fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7" gracePeriod=2 Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.145876 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566740-v69rd"] Mar 20 11:00:00 crc kubenswrapper[4772]: E0320 11:00:00.146324 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerName="registry-server" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.146364 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerName="registry-server" Mar 20 11:00:00 crc kubenswrapper[4772]: E0320 11:00:00.146388 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" containerName="extract-content" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.146405 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" containerName="extract-content" Mar 20 11:00:00 crc kubenswrapper[4772]: E0320 11:00:00.146428 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerName="extract-utilities" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.146445 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerName="extract-utilities" Mar 20 11:00:00 crc kubenswrapper[4772]: E0320 11:00:00.146474 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" containerName="extract-utilities" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.146492 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" containerName="extract-utilities" Mar 20 11:00:00 crc 
kubenswrapper[4772]: E0320 11:00:00.146527 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" containerName="registry-server" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.146546 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" containerName="registry-server" Mar 20 11:00:00 crc kubenswrapper[4772]: E0320 11:00:00.146575 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerName="extract-content" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.146592 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerName="extract-content" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.146939 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="38661b1d-4edd-438e-b69b-6e9f9c8a7d65" containerName="registry-server" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.146971 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bd2e5f3-52e7-4b15-bb08-d97ba4ec11fe" containerName="registry-server" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.147761 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-v69rd" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.153400 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.155574 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.156141 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.158756 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf"] Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.160418 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.167508 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.168892 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.172856 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf"] Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.174005 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.189883 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-v69rd"] Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.281501 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf9mw\" (UniqueName: \"kubernetes.io/projected/b1148455-d28e-4866-8b3e-cbabeaad84c7-kube-api-access-gf9mw\") pod \"b1148455-d28e-4866-8b3e-cbabeaad84c7\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.281552 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-catalog-content\") pod \"b1148455-d28e-4866-8b3e-cbabeaad84c7\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.281576 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-utilities\") pod \"b1148455-d28e-4866-8b3e-cbabeaad84c7\" (UID: \"b1148455-d28e-4866-8b3e-cbabeaad84c7\") " Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.281774 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zxcf4\" (UniqueName: \"kubernetes.io/projected/fd765c99-6471-4286-9dfe-c647dd180320-kube-api-access-zxcf4\") pod \"auto-csr-approver-29566740-v69rd\" (UID: \"fd765c99-6471-4286-9dfe-c647dd180320\") " pod="openshift-infra/auto-csr-approver-29566740-v69rd" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.281810 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a950f69-dc46-4917-b204-c8195f937827-config-volume\") pod \"collect-profiles-29566740-7ndqf\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.281864 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a950f69-dc46-4917-b204-c8195f937827-secret-volume\") pod \"collect-profiles-29566740-7ndqf\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.281879 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c2pw\" (UniqueName: \"kubernetes.io/projected/3a950f69-dc46-4917-b204-c8195f937827-kube-api-access-6c2pw\") pod \"collect-profiles-29566740-7ndqf\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.283681 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-utilities" (OuterVolumeSpecName: "utilities") pod "b1148455-d28e-4866-8b3e-cbabeaad84c7" (UID: "b1148455-d28e-4866-8b3e-cbabeaad84c7"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.303270 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-rq497"] Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.307715 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1148455-d28e-4866-8b3e-cbabeaad84c7-kube-api-access-gf9mw" (OuterVolumeSpecName: "kube-api-access-gf9mw") pod "b1148455-d28e-4866-8b3e-cbabeaad84c7" (UID: "b1148455-d28e-4866-8b3e-cbabeaad84c7"). InnerVolumeSpecName "kube-api-access-gf9mw". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.336074 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b1148455-d28e-4866-8b3e-cbabeaad84c7" (UID: "b1148455-d28e-4866-8b3e-cbabeaad84c7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.383318 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a950f69-dc46-4917-b204-c8195f937827-secret-volume\") pod \"collect-profiles-29566740-7ndqf\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.383362 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c2pw\" (UniqueName: \"kubernetes.io/projected/3a950f69-dc46-4917-b204-c8195f937827-kube-api-access-6c2pw\") pod \"collect-profiles-29566740-7ndqf\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.383421 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zxcf4\" (UniqueName: \"kubernetes.io/projected/fd765c99-6471-4286-9dfe-c647dd180320-kube-api-access-zxcf4\") pod \"auto-csr-approver-29566740-v69rd\" (UID: \"fd765c99-6471-4286-9dfe-c647dd180320\") " pod="openshift-infra/auto-csr-approver-29566740-v69rd" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.383449 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a950f69-dc46-4917-b204-c8195f937827-config-volume\") pod \"collect-profiles-29566740-7ndqf\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.383498 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf9mw\" (UniqueName: \"kubernetes.io/projected/b1148455-d28e-4866-8b3e-cbabeaad84c7-kube-api-access-gf9mw\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.383507 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.383516 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/b1148455-d28e-4866-8b3e-cbabeaad84c7-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.384300 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a950f69-dc46-4917-b204-c8195f937827-config-volume\") pod \"collect-profiles-29566740-7ndqf\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.388547 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a950f69-dc46-4917-b204-c8195f937827-secret-volume\") pod \"collect-profiles-29566740-7ndqf\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.400201 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zxcf4\" (UniqueName: \"kubernetes.io/projected/fd765c99-6471-4286-9dfe-c647dd180320-kube-api-access-zxcf4\") pod \"auto-csr-approver-29566740-v69rd\" (UID: \"fd765c99-6471-4286-9dfe-c647dd180320\") " pod="openshift-infra/auto-csr-approver-29566740-v69rd" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.405296 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c2pw\" (UniqueName: \"kubernetes.io/projected/3a950f69-dc46-4917-b204-c8195f937827-kube-api-access-6c2pw\") pod \"collect-profiles-29566740-7ndqf\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.481734 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-v69rd" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.487320 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.660462 4772 generic.go:334] "Generic (PLEG): container finished" podID="b1148455-d28e-4866-8b3e-cbabeaad84c7" containerID="fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7" exitCode=0 Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.660846 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-f9wnx" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.660851 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9wnx" event={"ID":"b1148455-d28e-4866-8b3e-cbabeaad84c7","Type":"ContainerDied","Data":"fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7"} Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.660947 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-f9wnx" event={"ID":"b1148455-d28e-4866-8b3e-cbabeaad84c7","Type":"ContainerDied","Data":"4fa33889d623a5c1e7a178dc55801293a86ec6dbe5f716335c9a02e325b3e93f"} Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.660984 4772 scope.go:117] "RemoveContainer" containerID="fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.679476 4772 scope.go:117] "RemoveContainer" containerID="2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.690993 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-f9wnx"] Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.693714 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-f9wnx"] Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.719071 4772 scope.go:117] "RemoveContainer" containerID="49d40b1c23653dd7d7a664dc4fb33bbb4ddc94bf3d4782892e2a69fcc40fb2a2" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.732169 4772 scope.go:117] "RemoveContainer" containerID="fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7" Mar 20 11:00:00 crc kubenswrapper[4772]: E0320 11:00:00.732551 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7\": container with ID starting with fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7 not found: ID does not exist" containerID="fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.732591 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7"} err="failed to get container status \"fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7\": rpc error: code = NotFound desc = could not find container \"fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7\": container with ID starting with fce48148176162388920f46090f1d2894962e39da939608c948eb4a78fc34bd7 not found: ID does not exist" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.732620 4772 scope.go:117] "RemoveContainer" containerID="2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661" Mar 20 11:00:00 crc kubenswrapper[4772]: E0320 11:00:00.732987 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661\": container with ID starting with 2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661 not found: ID does not exist" containerID="2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.733049 4772 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661"} err="failed to get container status \"2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661\": rpc error: code = NotFound desc = could not find container \"2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661\": container with ID starting with 2dfe909affeb4b23f3027ea33ae59cd8b21b981df3f589fc587e8c3085f8c661 not found: ID does not exist" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.733098 4772 scope.go:117] "RemoveContainer" containerID="49d40b1c23653dd7d7a664dc4fb33bbb4ddc94bf3d4782892e2a69fcc40fb2a2" Mar 20 11:00:00 crc kubenswrapper[4772]: E0320 11:00:00.733479 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"49d40b1c23653dd7d7a664dc4fb33bbb4ddc94bf3d4782892e2a69fcc40fb2a2\": container with ID starting with 49d40b1c23653dd7d7a664dc4fb33bbb4ddc94bf3d4782892e2a69fcc40fb2a2 not found: ID does not exist" containerID="49d40b1c23653dd7d7a664dc4fb33bbb4ddc94bf3d4782892e2a69fcc40fb2a2" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.733504 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"49d40b1c23653dd7d7a664dc4fb33bbb4ddc94bf3d4782892e2a69fcc40fb2a2"} err="failed to get container status \"49d40b1c23653dd7d7a664dc4fb33bbb4ddc94bf3d4782892e2a69fcc40fb2a2\": rpc error: code = NotFound desc = could not find container \"49d40b1c23653dd7d7a664dc4fb33bbb4ddc94bf3d4782892e2a69fcc40fb2a2\": container with ID starting with 49d40b1c23653dd7d7a664dc4fb33bbb4ddc94bf3d4782892e2a69fcc40fb2a2 not found: ID does not exist" Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.925373 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-v69rd"] Mar 20 11:00:00 crc kubenswrapper[4772]: W0320 11:00:00.930163 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd765c99_6471_4286_9dfe_c647dd180320.slice/crio-622169b7bf5078f0cbfce76d1059e2ed277c5d830c87ed13f9eb1d40591c7ab9 WatchSource:0}: Error finding container 622169b7bf5078f0cbfce76d1059e2ed277c5d830c87ed13f9eb1d40591c7ab9: Status 404 returned error can't find the container with id 622169b7bf5078f0cbfce76d1059e2ed277c5d830c87ed13f9eb1d40591c7ab9 Mar 20 11:00:00 crc kubenswrapper[4772]: I0320 11:00:00.989154 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf"] Mar 20 11:00:00 crc kubenswrapper[4772]: W0320 11:00:00.995617 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a950f69_dc46_4917_b204_c8195f937827.slice/crio-f4b37c6f782805af8e4ed5f9391d14578006461ecdb8d4c3e9d03342cc9820a7 WatchSource:0}: Error finding container f4b37c6f782805af8e4ed5f9391d14578006461ecdb8d4c3e9d03342cc9820a7: Status 404 returned error can't find the container with id f4b37c6f782805af8e4ed5f9391d14578006461ecdb8d4c3e9d03342cc9820a7 Mar 20 11:00:01 crc kubenswrapper[4772]: I0320 11:00:01.669135 4772 generic.go:334] "Generic (PLEG): container finished" podID="3a950f69-dc46-4917-b204-c8195f937827" containerID="f59ea5a510105fe60a16d016b3195a261750ad3d7095bbe1ac704d173bebe82c" exitCode=0 Mar 20 11:00:01 crc kubenswrapper[4772]: I0320 11:00:01.669190 4772 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" event={"ID":"3a950f69-dc46-4917-b204-c8195f937827","Type":"ContainerDied","Data":"f59ea5a510105fe60a16d016b3195a261750ad3d7095bbe1ac704d173bebe82c"} Mar 20 11:00:01 crc kubenswrapper[4772]: I0320 11:00:01.669548 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" event={"ID":"3a950f69-dc46-4917-b204-c8195f937827","Type":"ContainerStarted","Data":"f4b37c6f782805af8e4ed5f9391d14578006461ecdb8d4c3e9d03342cc9820a7"} Mar 20 11:00:01 crc kubenswrapper[4772]: I0320 11:00:01.671828 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-v69rd" event={"ID":"fd765c99-6471-4286-9dfe-c647dd180320","Type":"ContainerStarted","Data":"622169b7bf5078f0cbfce76d1059e2ed277c5d830c87ed13f9eb1d40591c7ab9"} Mar 20 11:00:02 crc kubenswrapper[4772]: I0320 11:00:02.647508 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1148455-d28e-4866-8b3e-cbabeaad84c7" path="/var/lib/kubelet/pods/b1148455-d28e-4866-8b3e-cbabeaad84c7/volumes" Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.099272 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.226257 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a950f69-dc46-4917-b204-c8195f937827-secret-volume\") pod \"3a950f69-dc46-4917-b204-c8195f937827\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.226341 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a950f69-dc46-4917-b204-c8195f937827-config-volume\") pod \"3a950f69-dc46-4917-b204-c8195f937827\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.226434 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c2pw\" (UniqueName: \"kubernetes.io/projected/3a950f69-dc46-4917-b204-c8195f937827-kube-api-access-6c2pw\") pod \"3a950f69-dc46-4917-b204-c8195f937827\" (UID: \"3a950f69-dc46-4917-b204-c8195f937827\") " Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.227665 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3a950f69-dc46-4917-b204-c8195f937827-config-volume" (OuterVolumeSpecName: "config-volume") pod "3a950f69-dc46-4917-b204-c8195f937827" (UID: "3a950f69-dc46-4917-b204-c8195f937827"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.232506 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3a950f69-dc46-4917-b204-c8195f937827-kube-api-access-6c2pw" (OuterVolumeSpecName: "kube-api-access-6c2pw") pod "3a950f69-dc46-4917-b204-c8195f937827" (UID: "3a950f69-dc46-4917-b204-c8195f937827"). InnerVolumeSpecName "kube-api-access-6c2pw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.233878 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3a950f69-dc46-4917-b204-c8195f937827-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "3a950f69-dc46-4917-b204-c8195f937827" (UID: "3a950f69-dc46-4917-b204-c8195f937827"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.327830 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c2pw\" (UniqueName: \"kubernetes.io/projected/3a950f69-dc46-4917-b204-c8195f937827-kube-api-access-6c2pw\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.327883 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/3a950f69-dc46-4917-b204-c8195f937827-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.327894 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/3a950f69-dc46-4917-b204-c8195f937827-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.688299 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" event={"ID":"3a950f69-dc46-4917-b204-c8195f937827","Type":"ContainerDied","Data":"f4b37c6f782805af8e4ed5f9391d14578006461ecdb8d4c3e9d03342cc9820a7"} Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.688368 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4b37c6f782805af8e4ed5f9391d14578006461ecdb8d4c3e9d03342cc9820a7" Mar 20 11:00:03 crc kubenswrapper[4772]: I0320 11:00:03.688429 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf" Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.287878 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bfc585969-cppp2"] Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.288998 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" podUID="824f2af7-476f-4b4e-96c5-4dcdcd159130" containerName="controller-manager" containerID="cri-o://75f29844894ee7c0e5e286ce1c2b84b0e96eb085f06f4e1c5349fb67cab00501" gracePeriod=30 Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.380773 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6"] Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.381272 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" podUID="6073a773-4b7a-4edf-a1d2-f559c26abc9c" containerName="route-controller-manager" containerID="cri-o://ece4f402a2093eb0df360f9f13102499075a4224a8599f5bcde2bb034a793ea8" gracePeriod=30 Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.696822 4772 generic.go:334] "Generic (PLEG): container finished" podID="824f2af7-476f-4b4e-96c5-4dcdcd159130" containerID="75f29844894ee7c0e5e286ce1c2b84b0e96eb085f06f4e1c5349fb67cab00501" exitCode=0 Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.696880 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" event={"ID":"824f2af7-476f-4b4e-96c5-4dcdcd159130","Type":"ContainerDied","Data":"75f29844894ee7c0e5e286ce1c2b84b0e96eb085f06f4e1c5349fb67cab00501"} Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.698345 4772 generic.go:334] "Generic (PLEG): container finished" podID="6073a773-4b7a-4edf-a1d2-f559c26abc9c" containerID="ece4f402a2093eb0df360f9f13102499075a4224a8599f5bcde2bb034a793ea8" exitCode=0 Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.698402 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" event={"ID":"6073a773-4b7a-4edf-a1d2-f559c26abc9c","Type":"ContainerDied","Data":"ece4f402a2093eb0df360f9f13102499075a4224a8599f5bcde2bb034a793ea8"} Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.699678 4772 generic.go:334] "Generic (PLEG): container finished" podID="fd765c99-6471-4286-9dfe-c647dd180320" containerID="7b724749b84b64887424fbe66d813cfe02190b9d27467daf6b4261992ae80790" exitCode=0 Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.699718 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-v69rd" event={"ID":"fd765c99-6471-4286-9dfe-c647dd180320","Type":"ContainerDied","Data":"7b724749b84b64887424fbe66d813cfe02190b9d27467daf6b4261992ae80790"} Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.812511 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.823813 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.950458 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s6mwj\" (UniqueName: \"kubernetes.io/projected/824f2af7-476f-4b4e-96c5-4dcdcd159130-kube-api-access-s6mwj\") pod \"824f2af7-476f-4b4e-96c5-4dcdcd159130\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.950917 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-client-ca\") pod \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.950950 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/824f2af7-476f-4b4e-96c5-4dcdcd159130-serving-cert\") pod \"824f2af7-476f-4b4e-96c5-4dcdcd159130\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.950976 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-proxy-ca-bundles\") pod \"824f2af7-476f-4b4e-96c5-4dcdcd159130\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.951003 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-config\") pod \"824f2af7-476f-4b4e-96c5-4dcdcd159130\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.951069 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6073a773-4b7a-4edf-a1d2-f559c26abc9c-serving-cert\") pod \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.951115 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-client-ca\") pod \"824f2af7-476f-4b4e-96c5-4dcdcd159130\" (UID: \"824f2af7-476f-4b4e-96c5-4dcdcd159130\") " Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.951152 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-config\") pod \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.951177 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lwpkk\" (UniqueName: \"kubernetes.io/projected/6073a773-4b7a-4edf-a1d2-f559c26abc9c-kube-api-access-lwpkk\") pod \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\" (UID: \"6073a773-4b7a-4edf-a1d2-f559c26abc9c\") " Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.952578 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-client-ca" (OuterVolumeSpecName: "client-ca") pod "824f2af7-476f-4b4e-96c5-4dcdcd159130" (UID: 
"824f2af7-476f-4b4e-96c5-4dcdcd159130"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.952676 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "824f2af7-476f-4b4e-96c5-4dcdcd159130" (UID: "824f2af7-476f-4b4e-96c5-4dcdcd159130"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.952695 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-config" (OuterVolumeSpecName: "config") pod "824f2af7-476f-4b4e-96c5-4dcdcd159130" (UID: "824f2af7-476f-4b4e-96c5-4dcdcd159130"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.952982 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-config" (OuterVolumeSpecName: "config") pod "6073a773-4b7a-4edf-a1d2-f559c26abc9c" (UID: "6073a773-4b7a-4edf-a1d2-f559c26abc9c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.955867 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6073a773-4b7a-4edf-a1d2-f559c26abc9c-kube-api-access-lwpkk" (OuterVolumeSpecName: "kube-api-access-lwpkk") pod "6073a773-4b7a-4edf-a1d2-f559c26abc9c" (UID: "6073a773-4b7a-4edf-a1d2-f559c26abc9c"). InnerVolumeSpecName "kube-api-access-lwpkk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.956097 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-client-ca" (OuterVolumeSpecName: "client-ca") pod "6073a773-4b7a-4edf-a1d2-f559c26abc9c" (UID: "6073a773-4b7a-4edf-a1d2-f559c26abc9c"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.957856 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/824f2af7-476f-4b4e-96c5-4dcdcd159130-kube-api-access-s6mwj" (OuterVolumeSpecName: "kube-api-access-s6mwj") pod "824f2af7-476f-4b4e-96c5-4dcdcd159130" (UID: "824f2af7-476f-4b4e-96c5-4dcdcd159130"). InnerVolumeSpecName "kube-api-access-s6mwj". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.960473 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6073a773-4b7a-4edf-a1d2-f559c26abc9c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6073a773-4b7a-4edf-a1d2-f559c26abc9c" (UID: "6073a773-4b7a-4edf-a1d2-f559c26abc9c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:04 crc kubenswrapper[4772]: I0320 11:00:04.960603 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/824f2af7-476f-4b4e-96c5-4dcdcd159130-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "824f2af7-476f-4b4e-96c5-4dcdcd159130" (UID: "824f2af7-476f-4b4e-96c5-4dcdcd159130"). 
InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.053029 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.053064 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.053074 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lwpkk\" (UniqueName: \"kubernetes.io/projected/6073a773-4b7a-4edf-a1d2-f559c26abc9c-kube-api-access-lwpkk\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.053083 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s6mwj\" (UniqueName: \"kubernetes.io/projected/824f2af7-476f-4b4e-96c5-4dcdcd159130-kube-api-access-s6mwj\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.053092 4772 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6073a773-4b7a-4edf-a1d2-f559c26abc9c-client-ca\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.053101 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/824f2af7-476f-4b4e-96c5-4dcdcd159130-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.053108 4772 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.053118 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/824f2af7-476f-4b4e-96c5-4dcdcd159130-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.053126 4772 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6073a773-4b7a-4edf-a1d2-f559c26abc9c-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.711236 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" event={"ID":"6073a773-4b7a-4edf-a1d2-f559c26abc9c","Type":"ContainerDied","Data":"78a9193284a5760eda3d6f05c14ac605ee64ecd07f454cb6cc30dc68f61195e8"} Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.711298 4772 scope.go:117] "RemoveContainer" containerID="ece4f402a2093eb0df360f9f13102499075a4224a8599f5bcde2bb034a793ea8" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.711418 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.718222 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.718222 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-bfc585969-cppp2" event={"ID":"824f2af7-476f-4b4e-96c5-4dcdcd159130","Type":"ContainerDied","Data":"e50a0144842e1949fabed49bcdb8571281469571ebe7b756baf686a87b7e6e7e"} Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.746734 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6"] Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.750763 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7dd98b5c79-wb9t6"] Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.758217 4772 scope.go:117] "RemoveContainer" containerID="75f29844894ee7c0e5e286ce1c2b84b0e96eb085f06f4e1c5349fb67cab00501" Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.764369 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-bfc585969-cppp2"] Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.767566 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-bfc585969-cppp2"] Mar 20 11:00:05 crc kubenswrapper[4772]: I0320 11:00:05.974966 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-v69rd" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.065259 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zxcf4\" (UniqueName: \"kubernetes.io/projected/fd765c99-6471-4286-9dfe-c647dd180320-kube-api-access-zxcf4\") pod \"fd765c99-6471-4286-9dfe-c647dd180320\" (UID: \"fd765c99-6471-4286-9dfe-c647dd180320\") " Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.068587 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd765c99-6471-4286-9dfe-c647dd180320-kube-api-access-zxcf4" (OuterVolumeSpecName: "kube-api-access-zxcf4") pod "fd765c99-6471-4286-9dfe-c647dd180320" (UID: "fd765c99-6471-4286-9dfe-c647dd180320"). InnerVolumeSpecName "kube-api-access-zxcf4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.166751 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zxcf4\" (UniqueName: \"kubernetes.io/projected/fd765c99-6471-4286-9dfe-c647dd180320-kube-api-access-zxcf4\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.315688 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-696d78dd6-skfkr"] Mar 20 11:00:06 crc kubenswrapper[4772]: E0320 11:00:06.316024 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="824f2af7-476f-4b4e-96c5-4dcdcd159130" containerName="controller-manager" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316040 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="824f2af7-476f-4b4e-96c5-4dcdcd159130" containerName="controller-manager" Mar 20 11:00:06 crc kubenswrapper[4772]: E0320 11:00:06.316062 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1148455-d28e-4866-8b3e-cbabeaad84c7" containerName="extract-content" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316079 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1148455-d28e-4866-8b3e-cbabeaad84c7" containerName="extract-content" Mar 20 11:00:06 crc kubenswrapper[4772]: E0320 11:00:06.316089 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6073a773-4b7a-4edf-a1d2-f559c26abc9c" containerName="route-controller-manager" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316096 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6073a773-4b7a-4edf-a1d2-f559c26abc9c" containerName="route-controller-manager" Mar 20 11:00:06 crc kubenswrapper[4772]: E0320 11:00:06.316103 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1148455-d28e-4866-8b3e-cbabeaad84c7" containerName="registry-server" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316110 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1148455-d28e-4866-8b3e-cbabeaad84c7" containerName="registry-server" Mar 20 11:00:06 crc kubenswrapper[4772]: E0320 11:00:06.316121 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3a950f69-dc46-4917-b204-c8195f937827" containerName="collect-profiles" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316128 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3a950f69-dc46-4917-b204-c8195f937827" containerName="collect-profiles" Mar 20 11:00:06 crc kubenswrapper[4772]: E0320 11:00:06.316136 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd765c99-6471-4286-9dfe-c647dd180320" containerName="oc" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316142 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd765c99-6471-4286-9dfe-c647dd180320" containerName="oc" Mar 20 11:00:06 crc kubenswrapper[4772]: E0320 11:00:06.316151 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b1148455-d28e-4866-8b3e-cbabeaad84c7" containerName="extract-utilities" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316158 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1148455-d28e-4866-8b3e-cbabeaad84c7" containerName="extract-utilities" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316261 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3a950f69-dc46-4917-b204-c8195f937827" containerName="collect-profiles" Mar 20 11:00:06 crc kubenswrapper[4772]: 
I0320 11:00:06.316272 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd765c99-6471-4286-9dfe-c647dd180320" containerName="oc" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316281 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="824f2af7-476f-4b4e-96c5-4dcdcd159130" containerName="controller-manager" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316297 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b1148455-d28e-4866-8b3e-cbabeaad84c7" containerName="registry-server" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316306 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6073a773-4b7a-4edf-a1d2-f559c26abc9c" containerName="route-controller-manager" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.316735 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.319667 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.320276 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.320599 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.320805 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.321002 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.321912 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.327913 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9"] Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.329112 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.331295 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.335103 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.336348 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.336531 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.336568 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.336623 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.337905 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.341142 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9"] Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.344499 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-696d78dd6-skfkr"] Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.470347 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-serving-cert\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.470406 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w67ts\" (UniqueName: \"kubernetes.io/projected/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-kube-api-access-w67ts\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.470432 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-config\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.470453 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-client-ca\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.470474 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-serving-cert\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.470493 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jsq7t\" (UniqueName: \"kubernetes.io/projected/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-kube-api-access-jsq7t\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.470518 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-proxy-ca-bundles\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.470691 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-client-ca\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.470821 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-config\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.572738 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w67ts\" (UniqueName: \"kubernetes.io/projected/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-kube-api-access-w67ts\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.572816 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-config\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.572889 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-client-ca\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.572935 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-serving-cert\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.572976 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jsq7t\" (UniqueName: \"kubernetes.io/projected/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-kube-api-access-jsq7t\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.573024 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-proxy-ca-bundles\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.573076 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-client-ca\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.573145 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-config\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.573189 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-serving-cert\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.574190 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-client-ca\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.574639 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-client-ca\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.574791 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-proxy-ca-bundles\") pod \"controller-manager-696d78dd6-skfkr\" (UID: 
\"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.575801 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-config\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.576540 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-config\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.579139 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-serving-cert\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.579720 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-serving-cert\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.593691 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w67ts\" (UniqueName: \"kubernetes.io/projected/191d3ac3-bc81-48da-9e8e-d0dfa0156b40-kube-api-access-w67ts\") pod \"controller-manager-696d78dd6-skfkr\" (UID: \"191d3ac3-bc81-48da-9e8e-d0dfa0156b40\") " pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.594527 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jsq7t\" (UniqueName: \"kubernetes.io/projected/ab0d379a-89b2-4f2e-9791-ada6ff9901cd-kube-api-access-jsq7t\") pod \"route-controller-manager-6dbd5cf686-t4cl9\" (UID: \"ab0d379a-89b2-4f2e-9791-ada6ff9901cd\") " pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.635921 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.648360 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.650515 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6073a773-4b7a-4edf-a1d2-f559c26abc9c" path="/var/lib/kubelet/pods/6073a773-4b7a-4edf-a1d2-f559c26abc9c/volumes" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.651284 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="824f2af7-476f-4b4e-96c5-4dcdcd159130" path="/var/lib/kubelet/pods/824f2af7-476f-4b4e-96c5-4dcdcd159130/volumes" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.730944 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566740-v69rd" event={"ID":"fd765c99-6471-4286-9dfe-c647dd180320","Type":"ContainerDied","Data":"622169b7bf5078f0cbfce76d1059e2ed277c5d830c87ed13f9eb1d40591c7ab9"} Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.731025 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="622169b7bf5078f0cbfce76d1059e2ed277c5d830c87ed13f9eb1d40591c7ab9" Mar 20 11:00:06 crc kubenswrapper[4772]: I0320 11:00:06.731124 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566740-v69rd" Mar 20 11:00:07 crc kubenswrapper[4772]: I0320 11:00:07.066082 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-696d78dd6-skfkr"] Mar 20 11:00:07 crc kubenswrapper[4772]: W0320 11:00:07.068490 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod191d3ac3_bc81_48da_9e8e_d0dfa0156b40.slice/crio-014d986e4904ff1cba96190a5f2d2a3c134ccf0488794ba6289b194fe6aad69b WatchSource:0}: Error finding container 014d986e4904ff1cba96190a5f2d2a3c134ccf0488794ba6289b194fe6aad69b: Status 404 returned error can't find the container with id 014d986e4904ff1cba96190a5f2d2a3c134ccf0488794ba6289b194fe6aad69b Mar 20 11:00:07 crc kubenswrapper[4772]: I0320 11:00:07.176252 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9"] Mar 20 11:00:07 crc kubenswrapper[4772]: I0320 11:00:07.736307 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" event={"ID":"ab0d379a-89b2-4f2e-9791-ada6ff9901cd","Type":"ContainerStarted","Data":"4ed136b1fd600ffe5fd7ecd60d242188ae91548f8f86a6908b692a21bcf47160"} Mar 20 11:00:07 crc kubenswrapper[4772]: I0320 11:00:07.736356 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" event={"ID":"ab0d379a-89b2-4f2e-9791-ada6ff9901cd","Type":"ContainerStarted","Data":"8b6e0730543d321266b7553c32a7c14b405baccc4170e0bc690cd4a944cb0585"} Mar 20 11:00:07 crc kubenswrapper[4772]: I0320 11:00:07.736564 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:07 crc kubenswrapper[4772]: I0320 11:00:07.737500 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" event={"ID":"191d3ac3-bc81-48da-9e8e-d0dfa0156b40","Type":"ContainerStarted","Data":"88ad4159b4ce04b974b6dca918d39fbf5885eaebe80acec880d54d8714a74ce8"} Mar 20 11:00:07 
crc kubenswrapper[4772]: I0320 11:00:07.737533 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" event={"ID":"191d3ac3-bc81-48da-9e8e-d0dfa0156b40","Type":"ContainerStarted","Data":"014d986e4904ff1cba96190a5f2d2a3c134ccf0488794ba6289b194fe6aad69b"} Mar 20 11:00:07 crc kubenswrapper[4772]: I0320 11:00:07.738097 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:07 crc kubenswrapper[4772]: I0320 11:00:07.749335 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" Mar 20 11:00:07 crc kubenswrapper[4772]: I0320 11:00:07.749424 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" Mar 20 11:00:07 crc kubenswrapper[4772]: I0320 11:00:07.790535 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6dbd5cf686-t4cl9" podStartSLOduration=3.790514554 podStartE2EDuration="3.790514554s" podCreationTimestamp="2026-03-20 11:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:07.766072948 +0000 UTC m=+293.857039443" watchObservedRunningTime="2026-03-20 11:00:07.790514554 +0000 UTC m=+293.881481029" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.840186 4772 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.841254 4772 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.841412 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.841521 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518" gracePeriod=15 Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.841569 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604" gracePeriod=15 Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.841567 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61" gracePeriod=15 Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.841535 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5" gracePeriod=15 Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.841636 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed" gracePeriod=15 Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.841911 4772 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 11:00:09 crc kubenswrapper[4772]: E0320 11:00:09.842043 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842058 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 11:00:09 crc kubenswrapper[4772]: E0320 11:00:09.842068 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842075 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: E0320 11:00:09.842083 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842091 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: E0320 11:00:09.842100 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842108 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 11:00:09 crc kubenswrapper[4772]: E0320 11:00:09.842117 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842125 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: E0320 11:00:09.842135 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842143 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 11:00:09 crc kubenswrapper[4772]: E0320 11:00:09.842150 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842156 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: E0320 11:00:09.842164 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842170 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 11:00:09 crc kubenswrapper[4772]: E0320 11:00:09.842180 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842186 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842274 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842282 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842289 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842299 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842306 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842313 4772 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842320 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Mar 20 11:00:09 crc kubenswrapper[4772]: E0320 11:00:09.842407 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842416 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842525 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.842694 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.871141 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-696d78dd6-skfkr" podStartSLOduration=5.871123595 podStartE2EDuration="5.871123595s" podCreationTimestamp="2026-03-20 11:00:04 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:07.815199738 +0000 UTC m=+293.906166223" watchObservedRunningTime="2026-03-20 11:00:09.871123595 +0000 UTC m=+295.962090080" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.873990 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.910966 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.911069 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.911266 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.911435 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:09 crc kubenswrapper[4772]: I0320 11:00:09.911555 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.013531 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.013960 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.013979 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.014041 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.014047 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.014074 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.014104 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.014130 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod 
\"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.014164 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.014189 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.014193 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.014214 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.014242 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.115191 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.115241 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.115259 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.115338 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.115359 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.115338 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.173019 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:10 crc kubenswrapper[4772]: W0320 11:00:10.193469 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-bf4abea8445d9d6980a09416bea187335e26c307ddd65e3cde97648c166511a5 WatchSource:0}: Error finding container bf4abea8445d9d6980a09416bea187335e26c307ddd65e3cde97648c166511a5: Status 404 returned error can't find the container with id bf4abea8445d9d6980a09416bea187335e26c307ddd65e3cde97648c166511a5 Mar 20 11:00:10 crc kubenswrapper[4772]: E0320 11:00:10.197049 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.119:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e879cabbaa0fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 11:00:10.196336891 +0000 UTC m=+296.287303376,LastTimestamp:2026-03-20 11:00:10.196336891 +0000 UTC m=+296.287303376,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.759334 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-check-endpoints/3.log" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.762124 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.763246 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5" exitCode=0 Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.763289 4772 generic.go:334] "Generic (PLEG): 
container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604" exitCode=0 Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.763304 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61" exitCode=0 Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.763319 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed" exitCode=2 Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.763366 4772 scope.go:117] "RemoveContainer" containerID="a70804152d3a197e8c1bcf73afca52f6d8c9cff00f842e2a4764a2a54053a5a5" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.766644 4772 generic.go:334] "Generic (PLEG): container finished" podID="c939fe35-51ef-40a4-951c-cebac7f55e8c" containerID="be2f87e1ea431564579692e2018584c5890503fe0a5261841cb20b3b711870b2" exitCode=0 Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.766746 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c939fe35-51ef-40a4-951c-cebac7f55e8c","Type":"ContainerDied","Data":"be2f87e1ea431564579692e2018584c5890503fe0a5261841cb20b3b711870b2"} Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.768112 4772 status_manager.go:851] "Failed to get status for pod" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.768547 4772 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.768936 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a"} Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.768983 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"bf4abea8445d9d6980a09416bea187335e26c307ddd65e3cde97648c166511a5"} Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.770021 4772 status_manager.go:851] "Failed to get status for pod" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:10 crc kubenswrapper[4772]: I0320 11:00:10.770708 4772 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:11 crc kubenswrapper[4772]: I0320 11:00:11.777051 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.221667 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.222705 4772 status_manager.go:851] "Failed to get status for pod" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.223088 4772 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.227273 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.228175 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.228640 4772 status_manager.go:851] "Failed to get status for pod" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.229049 4772 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.229353 4772 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345030 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345132 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: 
\"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345162 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345179 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-var-lock\") pod \"c939fe35-51ef-40a4-951c-cebac7f55e8c\" (UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345162 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345202 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-kubelet-dir\") pod \"c939fe35-51ef-40a4-951c-cebac7f55e8c\" (UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345229 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c939fe35-51ef-40a4-951c-cebac7f55e8c-kube-api-access\") pod \"c939fe35-51ef-40a4-951c-cebac7f55e8c\" (UID: \"c939fe35-51ef-40a4-951c-cebac7f55e8c\") " Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345228 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345242 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-var-lock" (OuterVolumeSpecName: "var-lock") pod "c939fe35-51ef-40a4-951c-cebac7f55e8c" (UID: "c939fe35-51ef-40a4-951c-cebac7f55e8c"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345250 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "c939fe35-51ef-40a4-951c-cebac7f55e8c" (UID: "c939fe35-51ef-40a4-951c-cebac7f55e8c"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345286 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345420 4772 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345430 4772 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345439 4772 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/c939fe35-51ef-40a4-951c-cebac7f55e8c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345447 4772 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.345455 4772 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.349724 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c939fe35-51ef-40a4-951c-cebac7f55e8c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "c939fe35-51ef-40a4-951c-cebac7f55e8c" (UID: "c939fe35-51ef-40a4-951c-cebac7f55e8c"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.446421 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/c939fe35-51ef-40a4-951c-cebac7f55e8c-kube-api-access\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.655786 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.790726 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.791728 4772 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518" exitCode=0 Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.791804 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.791927 4772 scope.go:117] "RemoveContainer" containerID="c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.792527 4772 status_manager.go:851] "Failed to get status for pod" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.792882 4772 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.793523 4772 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.794268 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"c939fe35-51ef-40a4-951c-cebac7f55e8c","Type":"ContainerDied","Data":"b3515e42162d3b9bc3278cd0aea4d678ca9228cbfc3c6d8c7af9af63e68c87a3"} Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.794307 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.794313 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b3515e42162d3b9bc3278cd0aea4d678ca9228cbfc3c6d8c7af9af63e68c87a3" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.798043 4772 status_manager.go:851] "Failed to get status for pod" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.798687 4772 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.799191 4772 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.803569 4772 status_manager.go:851] "Failed to get status for pod" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.803966 4772 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.804350 4772 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.821569 4772 scope.go:117] "RemoveContainer" containerID="937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.843425 4772 scope.go:117] "RemoveContainer" containerID="44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.869475 4772 scope.go:117] "RemoveContainer" containerID="ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.892347 4772 scope.go:117] "RemoveContainer" containerID="907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.919481 4772 scope.go:117] "RemoveContainer" 
containerID="d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.948941 4772 scope.go:117] "RemoveContainer" containerID="c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5" Mar 20 11:00:12 crc kubenswrapper[4772]: E0320 11:00:12.949638 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\": container with ID starting with c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5 not found: ID does not exist" containerID="c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.949735 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5"} err="failed to get container status \"c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\": rpc error: code = NotFound desc = could not find container \"c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5\": container with ID starting with c66fdbd86b51ab9c6d407e72621698de67b47a8d5e9a9b02f54e7a7aea6c21c5 not found: ID does not exist" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.949780 4772 scope.go:117] "RemoveContainer" containerID="937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604" Mar 20 11:00:12 crc kubenswrapper[4772]: E0320 11:00:12.950288 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\": container with ID starting with 937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604 not found: ID does not exist" containerID="937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.950322 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604"} err="failed to get container status \"937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\": rpc error: code = NotFound desc = could not find container \"937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604\": container with ID starting with 937456c1e36bd65cc3b10ab267fe01419bf11ad7cc4d6917b9b36c8f7c559604 not found: ID does not exist" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.950351 4772 scope.go:117] "RemoveContainer" containerID="44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61" Mar 20 11:00:12 crc kubenswrapper[4772]: E0320 11:00:12.950716 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\": container with ID starting with 44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61 not found: ID does not exist" containerID="44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.950799 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61"} err="failed to get container status \"44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\": rpc error: code = 
NotFound desc = could not find container \"44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61\": container with ID starting with 44fc5a1a16ba4e73d68b9222961948fc842edcc19e20cd0cc65275d91a2cac61 not found: ID does not exist" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.950977 4772 scope.go:117] "RemoveContainer" containerID="ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed" Mar 20 11:00:12 crc kubenswrapper[4772]: E0320 11:00:12.951773 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\": container with ID starting with ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed not found: ID does not exist" containerID="ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.951824 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed"} err="failed to get container status \"ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\": rpc error: code = NotFound desc = could not find container \"ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed\": container with ID starting with ec3da7be4d2d42e5896dcf470ef9bc31c94a2ee1381d70256640f4017b7481ed not found: ID does not exist" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.951889 4772 scope.go:117] "RemoveContainer" containerID="907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518" Mar 20 11:00:12 crc kubenswrapper[4772]: E0320 11:00:12.952752 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\": container with ID starting with 907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518 not found: ID does not exist" containerID="907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.952936 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518"} err="failed to get container status \"907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\": rpc error: code = NotFound desc = could not find container \"907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518\": container with ID starting with 907d8f172368762bab9b5e8656c856f7a778230c5eac1a67cd47a01fffb8f518 not found: ID does not exist" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.952980 4772 scope.go:117] "RemoveContainer" containerID="d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e" Mar 20 11:00:12 crc kubenswrapper[4772]: E0320 11:00:12.953611 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\": container with ID starting with d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e not found: ID does not exist" containerID="d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e" Mar 20 11:00:12 crc kubenswrapper[4772]: I0320 11:00:12.953758 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e"} err="failed to get container status \"d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\": rpc error: code = NotFound desc = could not find container \"d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e\": container with ID starting with d49da46bc7c842ced4dacb4e7a10256d44da12cb919e1bce18808c6d50c6076e not found: ID does not exist" Mar 20 11:00:14 crc kubenswrapper[4772]: I0320 11:00:14.646674 4772 status_manager.go:851] "Failed to get status for pod" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:14 crc kubenswrapper[4772]: I0320 11:00:14.647470 4772 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:14 crc kubenswrapper[4772]: I0320 11:00:14.648987 4772 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:16 crc kubenswrapper[4772]: E0320 11:00:16.939073 4772 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.119:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.189e879cabbaa0fb openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-03-20 11:00:10.196336891 +0000 UTC m=+296.287303376,LastTimestamp:2026-03-20 11:00:10.196336891 +0000 UTC m=+296.287303376,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Mar 20 11:00:19 crc kubenswrapper[4772]: E0320 11:00:19.939974 4772 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:19 crc kubenswrapper[4772]: E0320 11:00:19.940976 4772 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:19 crc kubenswrapper[4772]: E0320 11:00:19.941529 4772 controller.go:195] "Failed to 
update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:19 crc kubenswrapper[4772]: E0320 11:00:19.942186 4772 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:19 crc kubenswrapper[4772]: E0320 11:00:19.942636 4772 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:19 crc kubenswrapper[4772]: I0320 11:00:19.942692 4772 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Mar 20 11:00:19 crc kubenswrapper[4772]: E0320 11:00:19.943234 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="200ms" Mar 20 11:00:20 crc kubenswrapper[4772]: E0320 11:00:20.144488 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="400ms" Mar 20 11:00:20 crc kubenswrapper[4772]: E0320 11:00:20.546282 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="800ms" Mar 20 11:00:21 crc kubenswrapper[4772]: E0320 11:00:21.347764 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="1.6s" Mar 20 11:00:21 crc kubenswrapper[4772]: I0320 11:00:21.641514 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:21 crc kubenswrapper[4772]: I0320 11:00:21.642871 4772 status_manager.go:851] "Failed to get status for pod" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:21 crc kubenswrapper[4772]: I0320 11:00:21.643327 4772 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:21 crc kubenswrapper[4772]: I0320 11:00:21.656458 4772 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="69ce1b19-6ab4-4f21-bf6b-ffe4eca38794" Mar 20 11:00:21 crc kubenswrapper[4772]: I0320 11:00:21.656481 4772 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="69ce1b19-6ab4-4f21-bf6b-ffe4eca38794" Mar 20 11:00:21 crc kubenswrapper[4772]: E0320 11:00:21.656794 4772 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:21 crc kubenswrapper[4772]: I0320 11:00:21.657243 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:21 crc kubenswrapper[4772]: W0320 11:00:21.678982 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-b67ac6ab0adec2ab040427fc9c1ffbd207197ee7c19850b29468be9109fcddd1 WatchSource:0}: Error finding container b67ac6ab0adec2ab040427fc9c1ffbd207197ee7c19850b29468be9109fcddd1: Status 404 returned error can't find the container with id b67ac6ab0adec2ab040427fc9c1ffbd207197ee7c19850b29468be9109fcddd1 Mar 20 11:00:21 crc kubenswrapper[4772]: I0320 11:00:21.863524 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b67ac6ab0adec2ab040427fc9c1ffbd207197ee7c19850b29468be9109fcddd1"} Mar 20 11:00:22 crc kubenswrapper[4772]: I0320 11:00:22.874600 4772 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="b1495592fbf1c0db62012aa14cdf3fc5c8fc2be9bdfffc088930302152ece848" exitCode=0 Mar 20 11:00:22 crc kubenswrapper[4772]: I0320 11:00:22.874675 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"b1495592fbf1c0db62012aa14cdf3fc5c8fc2be9bdfffc088930302152ece848"} Mar 20 11:00:22 crc kubenswrapper[4772]: I0320 11:00:22.875076 4772 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="69ce1b19-6ab4-4f21-bf6b-ffe4eca38794" Mar 20 11:00:22 crc kubenswrapper[4772]: I0320 11:00:22.875108 4772 mirror_client.go:130] "Deleting a mirror 
pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="69ce1b19-6ab4-4f21-bf6b-ffe4eca38794" Mar 20 11:00:22 crc kubenswrapper[4772]: E0320 11:00:22.875668 4772 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:22 crc kubenswrapper[4772]: I0320 11:00:22.875963 4772 status_manager.go:851] "Failed to get status for pod" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:22 crc kubenswrapper[4772]: I0320 11:00:22.876531 4772 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.119:6443: connect: connection refused" Mar 20 11:00:22 crc kubenswrapper[4772]: E0320 11:00:22.949135 4772 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.119:6443: connect: connection refused" interval="3.2s" Mar 20 11:00:23 crc kubenswrapper[4772]: I0320 11:00:23.889899 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"2f17e467603d283cbabcff8f26e5beaa3b64d8d37587ae593e2cad84700a56dc"} Mar 20 11:00:23 crc kubenswrapper[4772]: I0320 11:00:23.889942 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"daa33bbb28a69d95a3b76433bc82ce895f03284348b07edb812860b6603fd664"} Mar 20 11:00:23 crc kubenswrapper[4772]: I0320 11:00:23.889954 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"14a13c172ea8d5dd8eb0caf9afed8183f24414ba8f9b591caf25b356e7b7ff09"} Mar 20 11:00:23 crc kubenswrapper[4772]: I0320 11:00:23.889965 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"026bdc23ad6ba65b76cd70c64cdaab8889f9b3ecfa6ab0faf23580d17c0557bd"} Mar 20 11:00:24 crc kubenswrapper[4772]: I0320 11:00:24.897185 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"01ae05142d1d5ec9eaf82d9d52eab5e07ac1816a08212affc1f95097512f6603"} Mar 20 11:00:24 crc kubenswrapper[4772]: I0320 11:00:24.897494 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:24 crc kubenswrapper[4772]: I0320 11:00:24.897640 4772 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="69ce1b19-6ab4-4f21-bf6b-ffe4eca38794" Mar 20 
11:00:24 crc kubenswrapper[4772]: I0320 11:00:24.897663 4772 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="69ce1b19-6ab4-4f21-bf6b-ffe4eca38794" Mar 20 11:00:24 crc kubenswrapper[4772]: I0320 11:00:24.900249 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 11:00:24 crc kubenswrapper[4772]: I0320 11:00:24.901076 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 11:00:24 crc kubenswrapper[4772]: I0320 11:00:24.901119 4772 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f" exitCode=1 Mar 20 11:00:24 crc kubenswrapper[4772]: I0320 11:00:24.901143 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f"} Mar 20 11:00:24 crc kubenswrapper[4772]: I0320 11:00:24.901881 4772 scope.go:117] "RemoveContainer" containerID="397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.342174 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" podUID="0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" containerName="oauth-openshift" containerID="cri-o://2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb" gracePeriod=15 Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.828920 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.908821 4772 generic.go:334] "Generic (PLEG): container finished" podID="0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" containerID="2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb" exitCode=0 Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.908913 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.908925 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" event={"ID":"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5","Type":"ContainerDied","Data":"2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb"} Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.908950 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-rq497" event={"ID":"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5","Type":"ContainerDied","Data":"f50ba9df3a18bb3155114b346bc96ce3aeb593a4406d18410ed48f830029ac02"} Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.908967 4772 scope.go:117] "RemoveContainer" containerID="2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.911645 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.912183 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.912235 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"7545285744ca0bf9924d83b7123de212a348ad73a95f9c983e3653fdad956551"} Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.923254 4772 scope.go:117] "RemoveContainer" containerID="2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb" Mar 20 11:00:25 crc kubenswrapper[4772]: E0320 11:00:25.923571 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb\": container with ID starting with 2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb not found: ID does not exist" containerID="2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.923619 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb"} err="failed to get container status \"2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb\": rpc error: code = NotFound desc = could not find container \"2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb\": container with ID starting with 2a96d8ba8538a608dd5d69a05d01892a51f204b2243cfc3a0c5a905b3ee91beb not found: ID does not exist" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938062 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-trusted-ca-bundle\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938095 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-session\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938124 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-service-ca\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938147 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-provider-selection\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938177 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-ocp-branding-template\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938209 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-serving-cert\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938227 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-policies\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938252 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-error\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938947 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938987 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-dir\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938963 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.939006 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.939025 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xxmn8\" (UniqueName: \"kubernetes.io/projected/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-kube-api-access-xxmn8\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.938999 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.939074 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-router-certs\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.939099 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-login\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.939126 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-idp-0-file-data\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.939154 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-cliconfig\") pod \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\" (UID: \"0fc51adf-8a0a-4993-8f2a-dcac261eb2f5\") " Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.939315 4772 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-policies\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.939328 4772 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-audit-dir\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.939338 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.939348 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.939785 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.944429 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.944584 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.944657 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-kube-api-access-xxmn8" (OuterVolumeSpecName: "kube-api-access-xxmn8") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "kube-api-access-xxmn8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.944985 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.945432 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.946161 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.950309 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "v4-0-config-user-template-error". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.950606 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:25 crc kubenswrapper[4772]: I0320 11:00:25.950874 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" (UID: "0fc51adf-8a0a-4993-8f2a-dcac261eb2f5"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.040638 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xxmn8\" (UniqueName: \"kubernetes.io/projected/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-kube-api-access-xxmn8\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.040706 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.040732 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.040760 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.040787 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.040813 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.040871 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.040901 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.040926 4772 
reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.040949 4772 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.658446 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.658757 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.666430 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:26 crc kubenswrapper[4772]: I0320 11:00:26.973776 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:00:27 crc kubenswrapper[4772]: I0320 11:00:27.986248 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:00:27 crc kubenswrapper[4772]: I0320 11:00:27.986402 4772 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 11:00:27 crc kubenswrapper[4772]: I0320 11:00:27.987050 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 11:00:29 crc kubenswrapper[4772]: I0320 11:00:29.909302 4772 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:29 crc kubenswrapper[4772]: I0320 11:00:29.939155 4772 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="69ce1b19-6ab4-4f21-bf6b-ffe4eca38794" Mar 20 11:00:29 crc kubenswrapper[4772]: I0320 11:00:29.939193 4772 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="69ce1b19-6ab4-4f21-bf6b-ffe4eca38794" Mar 20 11:00:29 crc kubenswrapper[4772]: I0320 11:00:29.947139 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:29 crc kubenswrapper[4772]: I0320 11:00:29.950873 4772 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="33288a27-04b0-4a0b-a804-1d5c40abd276" Mar 20 11:00:30 crc kubenswrapper[4772]: I0320 11:00:30.945352 4772 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="69ce1b19-6ab4-4f21-bf6b-ffe4eca38794" Mar 20 
11:00:30 crc kubenswrapper[4772]: I0320 11:00:30.945806 4772 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="69ce1b19-6ab4-4f21-bf6b-ffe4eca38794" Mar 20 11:00:34 crc kubenswrapper[4772]: I0320 11:00:34.676377 4772 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="33288a27-04b0-4a0b-a804-1d5c40abd276" Mar 20 11:00:37 crc kubenswrapper[4772]: I0320 11:00:37.987399 4772 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 11:00:37 crc kubenswrapper[4772]: I0320 11:00:37.987525 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 11:00:40 crc kubenswrapper[4772]: I0320 11:00:40.852337 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Mar 20 11:00:41 crc kubenswrapper[4772]: I0320 11:00:41.097822 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Mar 20 11:00:41 crc kubenswrapper[4772]: I0320 11:00:41.491488 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Mar 20 11:00:41 crc kubenswrapper[4772]: I0320 11:00:41.990192 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Mar 20 11:00:42 crc kubenswrapper[4772]: I0320 11:00:42.529158 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Mar 20 11:00:42 crc kubenswrapper[4772]: I0320 11:00:42.602038 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Mar 20 11:00:42 crc kubenswrapper[4772]: I0320 11:00:42.668627 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Mar 20 11:00:42 crc kubenswrapper[4772]: I0320 11:00:42.713334 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Mar 20 11:00:42 crc kubenswrapper[4772]: I0320 11:00:42.884615 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Mar 20 11:00:42 crc kubenswrapper[4772]: I0320 11:00:42.994782 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.004041 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.007955 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.113162 4772 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.143181 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.188011 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.218778 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.525912 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.617770 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.619914 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.708211 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.781122 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Mar 20 11:00:43 crc kubenswrapper[4772]: I0320 11:00:43.973813 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.033315 4772 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.116161 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.146537 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.192175 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.352946 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.376866 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.469890 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.493681 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.507838 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.593072 4772 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.598251 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.639926 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.692538 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.785816 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.789333 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Mar 20 11:00:44 crc kubenswrapper[4772]: I0320 11:00:44.867706 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.122021 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.353829 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.441233 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.566365 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.571349 4772 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.572948 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=36.572921858 podStartE2EDuration="36.572921858s" podCreationTimestamp="2026-03-20 11:00:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:29.714959135 +0000 UTC m=+315.805925650" watchObservedRunningTime="2026-03-20 11:00:45.572921858 +0000 UTC m=+331.663888383" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.579314 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-558db77b4-rq497"] Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.579412 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.588493 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.605445 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=16.60534553 podStartE2EDuration="16.60534553s" podCreationTimestamp="2026-03-20 
11:00:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:45.59821542 +0000 UTC m=+331.689181975" watchObservedRunningTime="2026-03-20 11:00:45.60534553 +0000 UTC m=+331.696312055" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.701387 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.745927 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.835135 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.855276 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Mar 20 11:00:45 crc kubenswrapper[4772]: I0320 11:00:45.981466 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.090686 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.100293 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.105459 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.149455 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.324926 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.395115 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.407996 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.518623 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.538273 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.559821 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.590816 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.612596 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.654560 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod 
volumes dir" podUID="0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" path="/var/lib/kubelet/pods/0fc51adf-8a0a-4993-8f2a-dcac261eb2f5/volumes" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.691017 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.699344 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.706825 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.800429 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.916958 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.941958 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Mar 20 11:00:46 crc kubenswrapper[4772]: I0320 11:00:46.983227 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.042336 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.065960 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.089890 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.122008 4772 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.198807 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.272351 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.299924 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.334765 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.336457 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.400895 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.438189 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Mar 20 11:00:47 crc 
kubenswrapper[4772]: I0320 11:00:47.479448 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.711347 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.720490 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.783968 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.831117 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.905774 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.946058 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.986293 4772 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/kube-controller-manager namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" start-of-body= Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.986355 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" probeResult="failure" output="Get \"https://192.168.126.11:10257/healthz\": dial tcp 192.168.126.11:10257: connect: connection refused" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.986405 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.986855 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="kube-controller-manager" containerStatusID={"Type":"cri-o","ID":"7545285744ca0bf9924d83b7123de212a348ad73a95f9c983e3653fdad956551"} pod="openshift-kube-controller-manager/kube-controller-manager-crc" containerMessage="Container kube-controller-manager failed startup probe, will be restarted" Mar 20 11:00:47 crc kubenswrapper[4772]: I0320 11:00:47.986969 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="kube-controller-manager" containerID="cri-o://7545285744ca0bf9924d83b7123de212a348ad73a95f9c983e3653fdad956551" gracePeriod=30 Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.033622 4772 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.075174 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.101912 4772 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-ingress-operator"/"metrics-tls" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.128343 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.172233 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.310153 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.340480 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.348347 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.382363 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.409770 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.422141 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.505044 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.512641 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.563199 4772 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.733749 4772 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.851322 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Mar 20 11:00:48 crc kubenswrapper[4772]: I0320 11:00:48.933671 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.020869 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.025418 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.028456 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.051077 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.069295 4772 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-authentication/oauth-openshift-76544b44f9-sbk8x"] Mar 20 11:00:49 crc kubenswrapper[4772]: E0320 11:00:49.069575 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" containerName="installer" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.069595 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" containerName="installer" Mar 20 11:00:49 crc kubenswrapper[4772]: E0320 11:00:49.069637 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" containerName="oauth-openshift" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.069649 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" containerName="oauth-openshift" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.069808 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="0fc51adf-8a0a-4993-8f2a-dcac261eb2f5" containerName="oauth-openshift" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.069832 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c939fe35-51ef-40a4-951c-cebac7f55e8c" containerName="installer" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.070382 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.074977 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.075171 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.075954 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.080512 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.080575 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.081016 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.082031 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.082326 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.082557 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.082760 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.082829 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Mar 20 11:00:49 crc 
kubenswrapper[4772]: I0320 11:00:49.084179 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.092337 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.113526 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.121654 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.126730 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.210582 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2fa198ab-063a-4169-a374-9fbad50538ca-audit-dir\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.210680 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.210715 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-session\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.210773 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xl8k\" (UniqueName: \"kubernetes.io/projected/2fa198ab-063a-4169-a374-9fbad50538ca-kube-api-access-7xl8k\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.210795 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-audit-policies\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.210881 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-template-login\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: 
\"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.210907 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.210983 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.211010 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.211065 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-template-error\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.211086 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.211114 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.211137 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.211212 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.218622 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.257700 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.290414 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.312752 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7xl8k\" (UniqueName: \"kubernetes.io/projected/2fa198ab-063a-4169-a374-9fbad50538ca-kube-api-access-7xl8k\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.312800 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-audit-policies\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.312831 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-template-login\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.312888 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.312917 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.312947 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-template-error\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 
crc kubenswrapper[4772]: I0320 11:00:49.312973 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.312997 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.313029 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.313059 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.313106 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.313161 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2fa198ab-063a-4169-a374-9fbad50538ca-audit-dir\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.313191 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.313223 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-session\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: 
I0320 11:00:49.313894 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-audit-policies\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.315507 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/2fa198ab-063a-4169-a374-9fbad50538ca-audit-dir\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.316622 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-service-ca\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.317138 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-cliconfig\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.317754 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.322376 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-router-certs\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.323659 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.324404 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.324738 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-session\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.325997 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.327025 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-template-error\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.327358 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.328669 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-system-serving-cert\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.338739 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/2fa198ab-063a-4169-a374-9fbad50538ca-v4-0-config-user-template-login\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.346612 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xl8k\" (UniqueName: \"kubernetes.io/projected/2fa198ab-063a-4169-a374-9fbad50538ca-kube-api-access-7xl8k\") pod \"oauth-openshift-76544b44f9-sbk8x\" (UID: \"2fa198ab-063a-4169-a374-9fbad50538ca\") " pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.407604 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.427824 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.505798 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.629478 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.633762 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.634472 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.651634 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.777989 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.778783 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.855823 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.893167 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.953229 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Mar 20 11:00:49 crc kubenswrapper[4772]: I0320 11:00:49.971476 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.017858 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.035510 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.115346 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.286598 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.296116 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.309979 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.417174 4772 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-apiserver"/"openshift-service-ca.crt" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.484710 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.544205 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.584760 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.609684 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.617478 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.696403 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.709209 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.713391 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.819793 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.822483 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.878316 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Mar 20 11:00:50 crc kubenswrapper[4772]: I0320 11:00:50.965085 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.016981 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.130148 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.131640 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.161251 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.193365 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.223322 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.242674 4772 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.248420 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.255571 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.271123 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.289274 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.342546 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.365979 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.431148 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.431192 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.437102 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.484624 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.492670 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.497571 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.570278 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.573324 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.625184 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.654999 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.734236 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.775099 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.784387 
4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.803592 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Mar 20 11:00:51 crc kubenswrapper[4772]: I0320 11:00:51.825607 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76544b44f9-sbk8x"] Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.012163 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.055680 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.139447 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.155051 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.263029 4772 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.263567 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a" gracePeriod=5 Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.292381 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-76544b44f9-sbk8x"] Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.316069 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.337592 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.491685 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.526403 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.545293 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.658116 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.738533 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.788191 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.857292 4772 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.922824 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.962632 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.986480 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Mar 20 11:00:52 crc kubenswrapper[4772]: I0320 11:00:52.997884 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.035831 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.149672 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" event={"ID":"2fa198ab-063a-4169-a374-9fbad50538ca","Type":"ContainerStarted","Data":"427ec05b575ee7a20d982bd1287c8ca6ef283025fb211e6ffaabe27fdff4660e"} Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.149726 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" event={"ID":"2fa198ab-063a-4169-a374-9fbad50538ca","Type":"ContainerStarted","Data":"4efe6bdb63f023fffe306fb25aa5fe94aa8ec305d12aaabe1505fc32144e0a63"} Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.149993 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.157319 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.159301 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.160581 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.182699 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-76544b44f9-sbk8x" podStartSLOduration=53.182669259 podStartE2EDuration="53.182669259s" podCreationTimestamp="2026-03-20 11:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:00:53.169001304 +0000 UTC m=+339.259967819" watchObservedRunningTime="2026-03-20 11:00:53.182669259 +0000 UTC m=+339.273635784" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.185089 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.237443 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.296518 4772 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-image-registry"/"image-registry-certificates" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.493617 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.557171 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.568322 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.701740 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.766700 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.770179 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.789523 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Mar 20 11:00:53 crc kubenswrapper[4772]: I0320 11:00:53.906217 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.044110 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.055753 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.081557 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.104475 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.259092 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.340042 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.358306 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.516016 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.767750 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.779462 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.793071 
4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.848999 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Mar 20 11:00:54 crc kubenswrapper[4772]: I0320 11:00:54.895694 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Mar 20 11:00:55 crc kubenswrapper[4772]: I0320 11:00:55.071720 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Mar 20 11:00:55 crc kubenswrapper[4772]: I0320 11:00:55.140264 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Mar 20 11:00:55 crc kubenswrapper[4772]: I0320 11:00:55.257058 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Mar 20 11:00:55 crc kubenswrapper[4772]: I0320 11:00:55.591523 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Mar 20 11:00:55 crc kubenswrapper[4772]: I0320 11:00:55.649307 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Mar 20 11:00:55 crc kubenswrapper[4772]: I0320 11:00:55.685368 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Mar 20 11:00:55 crc kubenswrapper[4772]: I0320 11:00:55.725649 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Mar 20 11:00:55 crc kubenswrapper[4772]: I0320 11:00:55.795681 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Mar 20 11:00:55 crc kubenswrapper[4772]: I0320 11:00:55.842235 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Mar 20 11:00:55 crc kubenswrapper[4772]: I0320 11:00:55.873080 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Mar 20 11:00:55 crc kubenswrapper[4772]: I0320 11:00:55.876951 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Mar 20 11:00:56 crc kubenswrapper[4772]: I0320 11:00:56.194317 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Mar 20 11:00:56 crc kubenswrapper[4772]: I0320 11:00:56.249615 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Mar 20 11:00:56 crc kubenswrapper[4772]: I0320 11:00:56.282365 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Mar 20 11:00:56 crc kubenswrapper[4772]: I0320 11:00:56.496822 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Mar 20 11:00:56 crc kubenswrapper[4772]: I0320 11:00:56.718745 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Mar 20 11:00:57 crc 
kubenswrapper[4772]: I0320 11:00:57.060948 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.458761 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.654544 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.835957 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.841106 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.841165 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.942005 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.942108 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.942144 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.942181 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.942243 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.942325 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.942334 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.942394 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.942517 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.942992 4772 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.943008 4772 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.943019 4772 reconciler_common.go:293] "Volume detached for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.943029 4772 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:57 crc kubenswrapper[4772]: I0320 11:00:57.951189 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.044202 4772 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.181359 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.181444 4772 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a" exitCode=137 Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.181511 4772 scope.go:117] "RemoveContainer" containerID="61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.181533 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.203504 4772 scope.go:117] "RemoveContainer" containerID="61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a" Mar 20 11:00:58 crc kubenswrapper[4772]: E0320 11:00:58.204082 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a\": container with ID starting with 61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a not found: ID does not exist" containerID="61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.204132 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a"} err="failed to get container status \"61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a\": rpc error: code = NotFound desc = could not find container \"61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a\": container with ID starting with 61b4bf24236a4a34419dd58a2e0bba6f9bc3be992d013738493592a17fda5e2a not found: ID does not exist" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.274334 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.404532 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.653411 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.654269 4772 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.671628 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.671689 4772 kubelet.go:2649] "Unable to find pod for 
mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3dd24731-5b64-4fba-a52a-495c863ddb4c" Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.679892 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Mar 20 11:00:58 crc kubenswrapper[4772]: I0320 11:00:58.680206 4772 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="3dd24731-5b64-4fba-a52a-495c863ddb4c" Mar 20 11:00:59 crc kubenswrapper[4772]: I0320 11:00:59.076346 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Mar 20 11:00:59 crc kubenswrapper[4772]: I0320 11:00:59.565175 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Mar 20 11:01:18 crc kubenswrapper[4772]: I0320 11:01:18.316029 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 11:01:18 crc kubenswrapper[4772]: I0320 11:01:18.319102 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 11:01:18 crc kubenswrapper[4772]: I0320 11:01:18.320165 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Mar 20 11:01:18 crc kubenswrapper[4772]: I0320 11:01:18.320268 4772 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="7545285744ca0bf9924d83b7123de212a348ad73a95f9c983e3653fdad956551" exitCode=137 Mar 20 11:01:18 crc kubenswrapper[4772]: I0320 11:01:18.320344 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"7545285744ca0bf9924d83b7123de212a348ad73a95f9c983e3653fdad956551"} Mar 20 11:01:18 crc kubenswrapper[4772]: I0320 11:01:18.320467 4772 scope.go:117] "RemoveContainer" containerID="397c2d42c2cf93cbdead9e638d3520dea1b0c21d7eef811a996ef4b738221a8f" Mar 20 11:01:18 crc kubenswrapper[4772]: I0320 11:01:18.323566 4772 generic.go:334] "Generic (PLEG): container finished" podID="77c87234-b79b-4d2f-8ee3-b14aa050925a" containerID="6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba" exitCode=0 Mar 20 11:01:18 crc kubenswrapper[4772]: I0320 11:01:18.323615 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" event={"ID":"77c87234-b79b-4d2f-8ee3-b14aa050925a","Type":"ContainerDied","Data":"6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba"} Mar 20 11:01:18 crc kubenswrapper[4772]: I0320 11:01:18.324555 4772 scope.go:117] "RemoveContainer" containerID="6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba" Mar 20 11:01:19 crc kubenswrapper[4772]: I0320 11:01:19.331073 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" 
event={"ID":"77c87234-b79b-4d2f-8ee3-b14aa050925a","Type":"ContainerStarted","Data":"9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0"} Mar 20 11:01:19 crc kubenswrapper[4772]: I0320 11:01:19.332179 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 11:01:19 crc kubenswrapper[4772]: I0320 11:01:19.334529 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/1.log" Mar 20 11:01:19 crc kubenswrapper[4772]: I0320 11:01:19.335326 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 11:01:19 crc kubenswrapper[4772]: I0320 11:01:19.336051 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/cluster-policy-controller/0.log" Mar 20 11:01:19 crc kubenswrapper[4772]: I0320 11:01:19.336094 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"2263a649cd50508f0b3d0404c35a4fa1adbfae83bf7b630f11bc4ac8e537b3b8"} Mar 20 11:01:26 crc kubenswrapper[4772]: I0320 11:01:26.974059 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:01:27 crc kubenswrapper[4772]: I0320 11:01:27.986790 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:01:27 crc kubenswrapper[4772]: I0320 11:01:27.992607 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:01:28 crc kubenswrapper[4772]: I0320 11:01:28.388118 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.201548 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566742-fl9lv"] Mar 20 11:02:00 crc kubenswrapper[4772]: E0320 11:02:00.202953 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.203018 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.203296 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.204100 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-fl9lv" Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.207133 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.207351 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.208243 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.215395 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-fl9lv"] Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.256121 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5vgk\" (UniqueName: \"kubernetes.io/projected/5590a77f-c988-4799-8fa8-ffb41d9153f2-kube-api-access-g5vgk\") pod \"auto-csr-approver-29566742-fl9lv\" (UID: \"5590a77f-c988-4799-8fa8-ffb41d9153f2\") " pod="openshift-infra/auto-csr-approver-29566742-fl9lv" Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.357809 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5vgk\" (UniqueName: \"kubernetes.io/projected/5590a77f-c988-4799-8fa8-ffb41d9153f2-kube-api-access-g5vgk\") pod \"auto-csr-approver-29566742-fl9lv\" (UID: \"5590a77f-c988-4799-8fa8-ffb41d9153f2\") " pod="openshift-infra/auto-csr-approver-29566742-fl9lv" Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.379315 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5vgk\" (UniqueName: \"kubernetes.io/projected/5590a77f-c988-4799-8fa8-ffb41d9153f2-kube-api-access-g5vgk\") pod \"auto-csr-approver-29566742-fl9lv\" (UID: \"5590a77f-c988-4799-8fa8-ffb41d9153f2\") " pod="openshift-infra/auto-csr-approver-29566742-fl9lv" Mar 20 11:02:00 crc kubenswrapper[4772]: I0320 11:02:00.530624 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-fl9lv" Mar 20 11:02:01 crc kubenswrapper[4772]: I0320 11:02:01.006790 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-fl9lv"] Mar 20 11:02:01 crc kubenswrapper[4772]: W0320 11:02:01.016587 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod5590a77f_c988_4799_8fa8_ffb41d9153f2.slice/crio-defd135f0166946f31f47da756c560fe1829022d289158e1eacbe526ca77ab54 WatchSource:0}: Error finding container defd135f0166946f31f47da756c560fe1829022d289158e1eacbe526ca77ab54: Status 404 returned error can't find the container with id defd135f0166946f31f47da756c560fe1829022d289158e1eacbe526ca77ab54 Mar 20 11:02:01 crc kubenswrapper[4772]: I0320 11:02:01.570728 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-fl9lv" event={"ID":"5590a77f-c988-4799-8fa8-ffb41d9153f2","Type":"ContainerStarted","Data":"defd135f0166946f31f47da756c560fe1829022d289158e1eacbe526ca77ab54"} Mar 20 11:02:02 crc kubenswrapper[4772]: I0320 11:02:02.577385 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-fl9lv" event={"ID":"5590a77f-c988-4799-8fa8-ffb41d9153f2","Type":"ContainerStarted","Data":"4ec9887e0ae666ed50a471610a9fbc68993fbf0bc75de98e2f2ff14b6350cb7f"} Mar 20 11:02:02 crc kubenswrapper[4772]: I0320 11:02:02.592440 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566742-fl9lv" podStartSLOduration=1.339883404 podStartE2EDuration="2.592396593s" podCreationTimestamp="2026-03-20 11:02:00 +0000 UTC" firstStartedPulling="2026-03-20 11:02:01.019054655 +0000 UTC m=+407.110021150" lastFinishedPulling="2026-03-20 11:02:02.271567854 +0000 UTC m=+408.362534339" observedRunningTime="2026-03-20 11:02:02.589039489 +0000 UTC m=+408.680005974" watchObservedRunningTime="2026-03-20 11:02:02.592396593 +0000 UTC m=+408.683363078" Mar 20 11:02:03 crc kubenswrapper[4772]: I0320 11:02:03.584906 4772 generic.go:334] "Generic (PLEG): container finished" podID="5590a77f-c988-4799-8fa8-ffb41d9153f2" containerID="4ec9887e0ae666ed50a471610a9fbc68993fbf0bc75de98e2f2ff14b6350cb7f" exitCode=0 Mar 20 11:02:03 crc kubenswrapper[4772]: I0320 11:02:03.585001 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-fl9lv" event={"ID":"5590a77f-c988-4799-8fa8-ffb41d9153f2","Type":"ContainerDied","Data":"4ec9887e0ae666ed50a471610a9fbc68993fbf0bc75de98e2f2ff14b6350cb7f"} Mar 20 11:02:04 crc kubenswrapper[4772]: I0320 11:02:04.869131 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-fl9lv" Mar 20 11:02:04 crc kubenswrapper[4772]: I0320 11:02:04.917153 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5vgk\" (UniqueName: \"kubernetes.io/projected/5590a77f-c988-4799-8fa8-ffb41d9153f2-kube-api-access-g5vgk\") pod \"5590a77f-c988-4799-8fa8-ffb41d9153f2\" (UID: \"5590a77f-c988-4799-8fa8-ffb41d9153f2\") " Mar 20 11:02:04 crc kubenswrapper[4772]: I0320 11:02:04.925113 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5590a77f-c988-4799-8fa8-ffb41d9153f2-kube-api-access-g5vgk" (OuterVolumeSpecName: "kube-api-access-g5vgk") pod "5590a77f-c988-4799-8fa8-ffb41d9153f2" (UID: "5590a77f-c988-4799-8fa8-ffb41d9153f2"). InnerVolumeSpecName "kube-api-access-g5vgk". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:05 crc kubenswrapper[4772]: I0320 11:02:05.018756 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5vgk\" (UniqueName: \"kubernetes.io/projected/5590a77f-c988-4799-8fa8-ffb41d9153f2-kube-api-access-g5vgk\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:05 crc kubenswrapper[4772]: I0320 11:02:05.602230 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566742-fl9lv" event={"ID":"5590a77f-c988-4799-8fa8-ffb41d9153f2","Type":"ContainerDied","Data":"defd135f0166946f31f47da756c560fe1829022d289158e1eacbe526ca77ab54"} Mar 20 11:02:05 crc kubenswrapper[4772]: I0320 11:02:05.602284 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566742-fl9lv" Mar 20 11:02:05 crc kubenswrapper[4772]: I0320 11:02:05.602287 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="defd135f0166946f31f47da756c560fe1829022d289158e1eacbe526ca77ab54" Mar 20 11:02:09 crc kubenswrapper[4772]: I0320 11:02:09.564951 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:02:09 crc kubenswrapper[4772]: I0320 11:02:09.566103 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.839648 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hxrz2"] Mar 20 11:02:15 crc kubenswrapper[4772]: E0320 11:02:15.840637 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5590a77f-c988-4799-8fa8-ffb41d9153f2" containerName="oc" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.840654 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5590a77f-c988-4799-8fa8-ffb41d9153f2" containerName="oc" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.840777 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5590a77f-c988-4799-8fa8-ffb41d9153f2" containerName="oc" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.841256 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.851808 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hxrz2"] Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.883979 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77b99a21-e979-4c30-99b2-b78ff2439ef7-registry-certificates\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.884104 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.884174 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77b99a21-e979-4c30-99b2-b78ff2439ef7-registry-tls\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.884204 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77b99a21-e979-4c30-99b2-b78ff2439ef7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.884277 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77b99a21-e979-4c30-99b2-b78ff2439ef7-bound-sa-token\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.884322 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77b99a21-e979-4c30-99b2-b78ff2439ef7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.884402 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77b99a21-e979-4c30-99b2-b78ff2439ef7-trusted-ca\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.884450 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k2wrh\" (UniqueName: 
\"kubernetes.io/projected/77b99a21-e979-4c30-99b2-b78ff2439ef7-kube-api-access-k2wrh\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.908506 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.985622 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77b99a21-e979-4c30-99b2-b78ff2439ef7-registry-certificates\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.985740 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77b99a21-e979-4c30-99b2-b78ff2439ef7-registry-tls\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.985775 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77b99a21-e979-4c30-99b2-b78ff2439ef7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.985811 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77b99a21-e979-4c30-99b2-b78ff2439ef7-bound-sa-token\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.985878 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77b99a21-e979-4c30-99b2-b78ff2439ef7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.985943 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77b99a21-e979-4c30-99b2-b78ff2439ef7-trusted-ca\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.985986 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k2wrh\" (UniqueName: \"kubernetes.io/projected/77b99a21-e979-4c30-99b2-b78ff2439ef7-kube-api-access-k2wrh\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " 
pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.986909 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/77b99a21-e979-4c30-99b2-b78ff2439ef7-ca-trust-extracted\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.987879 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77b99a21-e979-4c30-99b2-b78ff2439ef7-trusted-ca\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.988036 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/77b99a21-e979-4c30-99b2-b78ff2439ef7-registry-certificates\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:15 crc kubenswrapper[4772]: I0320 11:02:15.993539 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/77b99a21-e979-4c30-99b2-b78ff2439ef7-installation-pull-secrets\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:16 crc kubenswrapper[4772]: I0320 11:02:16.009985 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/77b99a21-e979-4c30-99b2-b78ff2439ef7-registry-tls\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:16 crc kubenswrapper[4772]: I0320 11:02:16.015257 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/77b99a21-e979-4c30-99b2-b78ff2439ef7-bound-sa-token\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:16 crc kubenswrapper[4772]: I0320 11:02:16.016989 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k2wrh\" (UniqueName: \"kubernetes.io/projected/77b99a21-e979-4c30-99b2-b78ff2439ef7-kube-api-access-k2wrh\") pod \"image-registry-66df7c8f76-hxrz2\" (UID: \"77b99a21-e979-4c30-99b2-b78ff2439ef7\") " pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:16 crc kubenswrapper[4772]: I0320 11:02:16.159113 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:16 crc kubenswrapper[4772]: I0320 11:02:16.609662 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-hxrz2"] Mar 20 11:02:16 crc kubenswrapper[4772]: I0320 11:02:16.680373 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" event={"ID":"77b99a21-e979-4c30-99b2-b78ff2439ef7","Type":"ContainerStarted","Data":"1b5150c045cc807c55c15a44f25cfb25216827958460b6752a2b4225303ee667"} Mar 20 11:02:17 crc kubenswrapper[4772]: I0320 11:02:17.689669 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" event={"ID":"77b99a21-e979-4c30-99b2-b78ff2439ef7","Type":"ContainerStarted","Data":"5e260c7b13c698ad614f3c9b2f824920aa246b2c2a2b34758987d716da4a7086"} Mar 20 11:02:17 crc kubenswrapper[4772]: I0320 11:02:17.690135 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:17 crc kubenswrapper[4772]: I0320 11:02:17.711753 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" podStartSLOduration=2.711732427 podStartE2EDuration="2.711732427s" podCreationTimestamp="2026-03-20 11:02:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:02:17.710689528 +0000 UTC m=+423.801656053" watchObservedRunningTime="2026-03-20 11:02:17.711732427 +0000 UTC m=+423.802698912" Mar 20 11:02:36 crc kubenswrapper[4772]: I0320 11:02:36.165319 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-hxrz2" Mar 20 11:02:36 crc kubenswrapper[4772]: I0320 11:02:36.221908 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xd664"] Mar 20 11:02:39 crc kubenswrapper[4772]: I0320 11:02:39.564355 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:02:39 crc kubenswrapper[4772]: I0320 11:02:39.564788 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.200053 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pt8p5"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.201264 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-pt8p5" podUID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" containerName="registry-server" containerID="cri-o://6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb" gracePeriod=30 Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.204581 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openshift-marketplace/community-operators-8kc55"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.205829 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-8kc55" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" containerName="registry-server" containerID="cri-o://8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0" gracePeriod=30 Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.230698 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6fhz7"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.231030 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" podUID="77c87234-b79b-4d2f-8ee3-b14aa050925a" containerName="marketplace-operator" containerID="cri-o://9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0" gracePeriod=30 Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.241251 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr4qj"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.241533 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-pr4qj" podUID="4eadb8ff-b747-4293-800f-b9894eb72ee3" containerName="registry-server" containerID="cri-o://1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713" gracePeriod=30 Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.251890 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6l8kp"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.252206 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-6l8kp" podUID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" containerName="registry-server" containerID="cri-o://24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a" gracePeriod=30 Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.252997 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gshhb"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.253633 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.270972 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gshhb"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.310583 4772 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/community-operators-8kc55" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" containerName="registry-server" probeResult="failure" output="" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.313034 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-marketplace/community-operators-8kc55" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" containerName="registry-server" probeResult="failure" output="" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.340880 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a15bf049-cf58-4f6d-942c-9bdaac82f6df-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gshhb\" (UID: \"a15bf049-cf58-4f6d-942c-9bdaac82f6df\") " pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.340941 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-grbv7\" (UniqueName: \"kubernetes.io/projected/a15bf049-cf58-4f6d-942c-9bdaac82f6df-kube-api-access-grbv7\") pod \"marketplace-operator-79b997595-gshhb\" (UID: \"a15bf049-cf58-4f6d-942c-9bdaac82f6df\") " pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.341071 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a15bf049-cf58-4f6d-942c-9bdaac82f6df-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gshhb\" (UID: \"a15bf049-cf58-4f6d-942c-9bdaac82f6df\") " pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.442639 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-grbv7\" (UniqueName: \"kubernetes.io/projected/a15bf049-cf58-4f6d-942c-9bdaac82f6df-kube-api-access-grbv7\") pod \"marketplace-operator-79b997595-gshhb\" (UID: \"a15bf049-cf58-4f6d-942c-9bdaac82f6df\") " pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.442720 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a15bf049-cf58-4f6d-942c-9bdaac82f6df-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gshhb\" (UID: \"a15bf049-cf58-4f6d-942c-9bdaac82f6df\") " pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.442772 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a15bf049-cf58-4f6d-942c-9bdaac82f6df-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gshhb\" (UID: \"a15bf049-cf58-4f6d-942c-9bdaac82f6df\") " pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:49 crc kubenswrapper[4772]: 
I0320 11:02:49.444675 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a15bf049-cf58-4f6d-942c-9bdaac82f6df-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-gshhb\" (UID: \"a15bf049-cf58-4f6d-942c-9bdaac82f6df\") " pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.459645 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/a15bf049-cf58-4f6d-942c-9bdaac82f6df-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-gshhb\" (UID: \"a15bf049-cf58-4f6d-942c-9bdaac82f6df\") " pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.462913 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-grbv7\" (UniqueName: \"kubernetes.io/projected/a15bf049-cf58-4f6d-942c-9bdaac82f6df-kube-api-access-grbv7\") pod \"marketplace-operator-79b997595-gshhb\" (UID: \"a15bf049-cf58-4f6d-942c-9bdaac82f6df\") " pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.653629 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.660013 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.661382 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8kc55" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.667330 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.680541 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.696973 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746169 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-utilities\") pod \"22e23182-8e10-42d5-b34d-f09f6f280262\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746224 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r4v48\" (UniqueName: \"kubernetes.io/projected/4eadb8ff-b747-4293-800f-b9894eb72ee3-kube-api-access-r4v48\") pod \"4eadb8ff-b747-4293-800f-b9894eb72ee3\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746254 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-catalog-content\") pod \"4eadb8ff-b747-4293-800f-b9894eb72ee3\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746287 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-catalog-content\") pod \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\" (UID: \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746307 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x6cg2\" (UniqueName: \"kubernetes.io/projected/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-kube-api-access-x6cg2\") pod \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\" (UID: \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746329 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-utilities\") pod \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\" (UID: \"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746353 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhlbx\" (UniqueName: \"kubernetes.io/projected/77c87234-b79b-4d2f-8ee3-b14aa050925a-kube-api-access-nhlbx\") pod \"77c87234-b79b-4d2f-8ee3-b14aa050925a\" (UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746372 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-operator-metrics\") pod \"77c87234-b79b-4d2f-8ee3-b14aa050925a\" (UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746397 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-utilities\") pod \"4eadb8ff-b747-4293-800f-b9894eb72ee3\" (UID: \"4eadb8ff-b747-4293-800f-b9894eb72ee3\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746436 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-catalog-content\") pod \"22e23182-8e10-42d5-b34d-f09f6f280262\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746457 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fr9dx\" (UniqueName: \"kubernetes.io/projected/22e23182-8e10-42d5-b34d-f09f6f280262-kube-api-access-fr9dx\") pod \"22e23182-8e10-42d5-b34d-f09f6f280262\" (UID: \"22e23182-8e10-42d5-b34d-f09f6f280262\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.746486 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-trusted-ca\") pod \"77c87234-b79b-4d2f-8ee3-b14aa050925a\" (UID: \"77c87234-b79b-4d2f-8ee3-b14aa050925a\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.748445 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-utilities" (OuterVolumeSpecName: "utilities") pod "22e23182-8e10-42d5-b34d-f09f6f280262" (UID: "22e23182-8e10-42d5-b34d-f09f6f280262"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.749229 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-utilities" (OuterVolumeSpecName: "utilities") pod "4eadb8ff-b747-4293-800f-b9894eb72ee3" (UID: "4eadb8ff-b747-4293-800f-b9894eb72ee3"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.750164 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "77c87234-b79b-4d2f-8ee3-b14aa050925a" (UID: "77c87234-b79b-4d2f-8ee3-b14aa050925a"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.751056 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "77c87234-b79b-4d2f-8ee3-b14aa050925a" (UID: "77c87234-b79b-4d2f-8ee3-b14aa050925a"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.758164 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4eadb8ff-b747-4293-800f-b9894eb72ee3-kube-api-access-r4v48" (OuterVolumeSpecName: "kube-api-access-r4v48") pod "4eadb8ff-b747-4293-800f-b9894eb72ee3" (UID: "4eadb8ff-b747-4293-800f-b9894eb72ee3"). InnerVolumeSpecName "kube-api-access-r4v48". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.761508 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/77c87234-b79b-4d2f-8ee3-b14aa050925a-kube-api-access-nhlbx" (OuterVolumeSpecName: "kube-api-access-nhlbx") pod "77c87234-b79b-4d2f-8ee3-b14aa050925a" (UID: "77c87234-b79b-4d2f-8ee3-b14aa050925a"). 
InnerVolumeSpecName "kube-api-access-nhlbx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.763138 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22e23182-8e10-42d5-b34d-f09f6f280262-kube-api-access-fr9dx" (OuterVolumeSpecName: "kube-api-access-fr9dx") pod "22e23182-8e10-42d5-b34d-f09f6f280262" (UID: "22e23182-8e10-42d5-b34d-f09f6f280262"). InnerVolumeSpecName "kube-api-access-fr9dx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.763172 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-utilities" (OuterVolumeSpecName: "utilities") pod "8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" (UID: "8db2c4ed-fcb2-48eb-a1a0-1be1d8613260"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.776346 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-kube-api-access-x6cg2" (OuterVolumeSpecName: "kube-api-access-x6cg2") pod "8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" (UID: "8db2c4ed-fcb2-48eb-a1a0-1be1d8613260"). InnerVolumeSpecName "kube-api-access-x6cg2". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.797917 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4eadb8ff-b747-4293-800f-b9894eb72ee3" (UID: "4eadb8ff-b747-4293-800f-b9894eb72ee3"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.813583 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" (UID: "8db2c4ed-fcb2-48eb-a1a0-1be1d8613260"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.830409 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "22e23182-8e10-42d5-b34d-f09f6f280262" (UID: "22e23182-8e10-42d5-b34d-f09f6f280262"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.847965 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ggr6f\" (UniqueName: \"kubernetes.io/projected/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-kube-api-access-ggr6f\") pod \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848120 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-utilities\") pod \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848224 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-catalog-content\") pod \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\" (UID: \"3514d32d-88b3-47e4-b541-6ab2d46a6cfe\") " Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848674 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848693 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fr9dx\" (UniqueName: \"kubernetes.io/projected/22e23182-8e10-42d5-b34d-f09f6f280262-kube-api-access-fr9dx\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848707 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848719 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/22e23182-8e10-42d5-b34d-f09f6f280262-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848730 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r4v48\" (UniqueName: \"kubernetes.io/projected/4eadb8ff-b747-4293-800f-b9894eb72ee3-kube-api-access-r4v48\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848741 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848754 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848765 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x6cg2\" (UniqueName: \"kubernetes.io/projected/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-kube-api-access-x6cg2\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848775 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260-utilities\") on node 
\"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848786 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhlbx\" (UniqueName: \"kubernetes.io/projected/77c87234-b79b-4d2f-8ee3-b14aa050925a-kube-api-access-nhlbx\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848796 4772 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/77c87234-b79b-4d2f-8ee3-b14aa050925a-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.848809 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4eadb8ff-b747-4293-800f-b9894eb72ee3-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.849665 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-utilities" (OuterVolumeSpecName: "utilities") pod "3514d32d-88b3-47e4-b541-6ab2d46a6cfe" (UID: "3514d32d-88b3-47e4-b541-6ab2d46a6cfe"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.851558 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-kube-api-access-ggr6f" (OuterVolumeSpecName: "kube-api-access-ggr6f") pod "3514d32d-88b3-47e4-b541-6ab2d46a6cfe" (UID: "3514d32d-88b3-47e4-b541-6ab2d46a6cfe"). InnerVolumeSpecName "kube-api-access-ggr6f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.867297 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-gshhb"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.893309 4772 generic.go:334] "Generic (PLEG): container finished" podID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" containerID="24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a" exitCode=0 Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.893455 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6l8kp" event={"ID":"3514d32d-88b3-47e4-b541-6ab2d46a6cfe","Type":"ContainerDied","Data":"24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a"} Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.893648 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-6l8kp" event={"ID":"3514d32d-88b3-47e4-b541-6ab2d46a6cfe","Type":"ContainerDied","Data":"69add3e429d14dbbc50247c1c652f40e380e2e90b279e9569f66784030708d14"} Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.893618 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-6l8kp" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.893678 4772 scope.go:117] "RemoveContainer" containerID="24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.895607 4772 generic.go:334] "Generic (PLEG): container finished" podID="22e23182-8e10-42d5-b34d-f09f6f280262" containerID="8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0" exitCode=0 Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.895664 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kc55" event={"ID":"22e23182-8e10-42d5-b34d-f09f6f280262","Type":"ContainerDied","Data":"8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0"} Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.895696 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-8kc55" event={"ID":"22e23182-8e10-42d5-b34d-f09f6f280262","Type":"ContainerDied","Data":"1ab9a8952b9359270517ab843b7a5a3af7b0c0c5b107b6d51a61939e2bfe6115"} Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.895741 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-8kc55" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.900126 4772 generic.go:334] "Generic (PLEG): container finished" podID="77c87234-b79b-4d2f-8ee3-b14aa050925a" containerID="9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0" exitCode=0 Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.900232 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.900361 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" event={"ID":"77c87234-b79b-4d2f-8ee3-b14aa050925a","Type":"ContainerDied","Data":"9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0"} Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.900408 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-6fhz7" event={"ID":"77c87234-b79b-4d2f-8ee3-b14aa050925a","Type":"ContainerDied","Data":"f2f2b410f180f03d7ab26a78482481df27137452b69b0a8549214b4180af066d"} Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.905202 4772 generic.go:334] "Generic (PLEG): container finished" podID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" containerID="6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb" exitCode=0 Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.905319 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt8p5" event={"ID":"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260","Type":"ContainerDied","Data":"6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb"} Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.905342 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-pt8p5" event={"ID":"8db2c4ed-fcb2-48eb-a1a0-1be1d8613260","Type":"ContainerDied","Data":"150fb99e7990748158fb20e8f50679d99cdda0612a5f03d3a19aac5027c3dfe8"} Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.905431 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-pt8p5" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.907426 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" event={"ID":"a15bf049-cf58-4f6d-942c-9bdaac82f6df","Type":"ContainerStarted","Data":"4dd8b75fd654adf926f3295b1cb65c87d2ea8a00406c55608f2872ff5232b257"} Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.909694 4772 generic.go:334] "Generic (PLEG): container finished" podID="4eadb8ff-b747-4293-800f-b9894eb72ee3" containerID="1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713" exitCode=0 Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.909732 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr4qj" event={"ID":"4eadb8ff-b747-4293-800f-b9894eb72ee3","Type":"ContainerDied","Data":"1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713"} Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.909758 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-pr4qj" event={"ID":"4eadb8ff-b747-4293-800f-b9894eb72ee3","Type":"ContainerDied","Data":"60a5c5c418964f777a5f5a040d631fd21f9a272690ffe1dbd0db82d29a6fe4f9"} Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.909881 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-pr4qj" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.916936 4772 scope.go:117] "RemoveContainer" containerID="54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.931882 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-8kc55"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.935081 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-8kc55"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.954888 4772 scope.go:117] "RemoveContainer" containerID="32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.956063 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ggr6f\" (UniqueName: \"kubernetes.io/projected/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-kube-api-access-ggr6f\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.956253 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.970943 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-pt8p5"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.979141 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-pt8p5"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.983616 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr4qj"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.986858 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-pr4qj"] Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.989741 4772 scope.go:117] "RemoveContainer" 
containerID="24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a" Mar 20 11:02:49 crc kubenswrapper[4772]: E0320 11:02:49.990219 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a\": container with ID starting with 24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a not found: ID does not exist" containerID="24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.990268 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a"} err="failed to get container status \"24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a\": rpc error: code = NotFound desc = could not find container \"24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a\": container with ID starting with 24e95251e74bcb507480e6f441e97cb94594d882b78933864853d0a51901148a not found: ID does not exist" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.990301 4772 scope.go:117] "RemoveContainer" containerID="54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.990423 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6fhz7"] Mar 20 11:02:49 crc kubenswrapper[4772]: E0320 11:02:49.990617 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71\": container with ID starting with 54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71 not found: ID does not exist" containerID="54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.990644 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71"} err="failed to get container status \"54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71\": rpc error: code = NotFound desc = could not find container \"54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71\": container with ID starting with 54ef8906168d8ac2fdca780b9afb4d6067ee846ce34dc97803f85ff9caa8de71 not found: ID does not exist" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.990664 4772 scope.go:117] "RemoveContainer" containerID="32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5" Mar 20 11:02:49 crc kubenswrapper[4772]: E0320 11:02:49.990913 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5\": container with ID starting with 32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5 not found: ID does not exist" containerID="32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.990943 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5"} err="failed to get container status \"32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5\": rpc error: code 
= NotFound desc = could not find container \"32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5\": container with ID starting with 32ef743a5c53959f899f742281b0bd30bf42525dcb18e1b764991897b0181ed5 not found: ID does not exist" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.990964 4772 scope.go:117] "RemoveContainer" containerID="8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0" Mar 20 11:02:49 crc kubenswrapper[4772]: I0320 11:02:49.995181 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-6fhz7"] Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.005826 4772 scope.go:117] "RemoveContainer" containerID="237b36c26cfb1ce5836c98bcca1d49beb6567974e68cc283998a7555792ac1a5" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.020275 4772 scope.go:117] "RemoveContainer" containerID="b9c0ea46943d859c363c21a4973942a4d4549ec717fee5ce381c83dba8f4cf55" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.032973 4772 scope.go:117] "RemoveContainer" containerID="8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0" Mar 20 11:02:50 crc kubenswrapper[4772]: E0320 11:02:50.033361 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0\": container with ID starting with 8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0 not found: ID does not exist" containerID="8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.033401 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0"} err="failed to get container status \"8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0\": rpc error: code = NotFound desc = could not find container \"8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0\": container with ID starting with 8ad8f801c647cdce0e4ed3dc7d6bedd1285ea7a3f8b96fd6a11869cfdc6dd8c0 not found: ID does not exist" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.033421 4772 scope.go:117] "RemoveContainer" containerID="237b36c26cfb1ce5836c98bcca1d49beb6567974e68cc283998a7555792ac1a5" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.033538 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3514d32d-88b3-47e4-b541-6ab2d46a6cfe" (UID: "3514d32d-88b3-47e4-b541-6ab2d46a6cfe"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:02:50 crc kubenswrapper[4772]: E0320 11:02:50.033832 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"237b36c26cfb1ce5836c98bcca1d49beb6567974e68cc283998a7555792ac1a5\": container with ID starting with 237b36c26cfb1ce5836c98bcca1d49beb6567974e68cc283998a7555792ac1a5 not found: ID does not exist" containerID="237b36c26cfb1ce5836c98bcca1d49beb6567974e68cc283998a7555792ac1a5" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.033889 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"237b36c26cfb1ce5836c98bcca1d49beb6567974e68cc283998a7555792ac1a5"} err="failed to get container status \"237b36c26cfb1ce5836c98bcca1d49beb6567974e68cc283998a7555792ac1a5\": rpc error: code = NotFound desc = could not find container \"237b36c26cfb1ce5836c98bcca1d49beb6567974e68cc283998a7555792ac1a5\": container with ID starting with 237b36c26cfb1ce5836c98bcca1d49beb6567974e68cc283998a7555792ac1a5 not found: ID does not exist" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.033917 4772 scope.go:117] "RemoveContainer" containerID="b9c0ea46943d859c363c21a4973942a4d4549ec717fee5ce381c83dba8f4cf55" Mar 20 11:02:50 crc kubenswrapper[4772]: E0320 11:02:50.034417 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9c0ea46943d859c363c21a4973942a4d4549ec717fee5ce381c83dba8f4cf55\": container with ID starting with b9c0ea46943d859c363c21a4973942a4d4549ec717fee5ce381c83dba8f4cf55 not found: ID does not exist" containerID="b9c0ea46943d859c363c21a4973942a4d4549ec717fee5ce381c83dba8f4cf55" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.034444 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9c0ea46943d859c363c21a4973942a4d4549ec717fee5ce381c83dba8f4cf55"} err="failed to get container status \"b9c0ea46943d859c363c21a4973942a4d4549ec717fee5ce381c83dba8f4cf55\": rpc error: code = NotFound desc = could not find container \"b9c0ea46943d859c363c21a4973942a4d4549ec717fee5ce381c83dba8f4cf55\": container with ID starting with b9c0ea46943d859c363c21a4973942a4d4549ec717fee5ce381c83dba8f4cf55 not found: ID does not exist" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.034458 4772 scope.go:117] "RemoveContainer" containerID="9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.044744 4772 scope.go:117] "RemoveContainer" containerID="6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.057802 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3514d32d-88b3-47e4-b541-6ab2d46a6cfe-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.058375 4772 scope.go:117] "RemoveContainer" containerID="9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0" Mar 20 11:02:50 crc kubenswrapper[4772]: E0320 11:02:50.058871 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0\": container with ID starting with 9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0 not found: ID does not exist" 
containerID="9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.058907 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0"} err="failed to get container status \"9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0\": rpc error: code = NotFound desc = could not find container \"9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0\": container with ID starting with 9d86be92c52f857ea3f5d5afc2af662292966fa1054f3acaa8db89abd66152e0 not found: ID does not exist" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.058932 4772 scope.go:117] "RemoveContainer" containerID="6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba" Mar 20 11:02:50 crc kubenswrapper[4772]: E0320 11:02:50.060104 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba\": container with ID starting with 6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba not found: ID does not exist" containerID="6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.060133 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba"} err="failed to get container status \"6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba\": rpc error: code = NotFound desc = could not find container \"6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba\": container with ID starting with 6e7c70599a4ccafe05d7a7f60ff0e4ab688957e543d39cadd38ed417ec360fba not found: ID does not exist" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.060148 4772 scope.go:117] "RemoveContainer" containerID="6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.078606 4772 scope.go:117] "RemoveContainer" containerID="410c78cf63d885abae3c4000b16235caf580a2960bcbfe6a8e8ab52675e345de" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.091632 4772 scope.go:117] "RemoveContainer" containerID="ccecad6519639b697b142db9d0f854677d7ce54e92a81915b8ef686030ec28db" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.104410 4772 scope.go:117] "RemoveContainer" containerID="6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb" Mar 20 11:02:50 crc kubenswrapper[4772]: E0320 11:02:50.104899 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb\": container with ID starting with 6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb not found: ID does not exist" containerID="6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.104928 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb"} err="failed to get container status \"6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb\": rpc error: code = NotFound desc = could not find container 
\"6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb\": container with ID starting with 6381e8367415d038241d47902724f2426465aa6e1bdb335ef00c6b691f959feb not found: ID does not exist" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.104949 4772 scope.go:117] "RemoveContainer" containerID="410c78cf63d885abae3c4000b16235caf580a2960bcbfe6a8e8ab52675e345de" Mar 20 11:02:50 crc kubenswrapper[4772]: E0320 11:02:50.105381 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"410c78cf63d885abae3c4000b16235caf580a2960bcbfe6a8e8ab52675e345de\": container with ID starting with 410c78cf63d885abae3c4000b16235caf580a2960bcbfe6a8e8ab52675e345de not found: ID does not exist" containerID="410c78cf63d885abae3c4000b16235caf580a2960bcbfe6a8e8ab52675e345de" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.105426 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"410c78cf63d885abae3c4000b16235caf580a2960bcbfe6a8e8ab52675e345de"} err="failed to get container status \"410c78cf63d885abae3c4000b16235caf580a2960bcbfe6a8e8ab52675e345de\": rpc error: code = NotFound desc = could not find container \"410c78cf63d885abae3c4000b16235caf580a2960bcbfe6a8e8ab52675e345de\": container with ID starting with 410c78cf63d885abae3c4000b16235caf580a2960bcbfe6a8e8ab52675e345de not found: ID does not exist" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.105453 4772 scope.go:117] "RemoveContainer" containerID="ccecad6519639b697b142db9d0f854677d7ce54e92a81915b8ef686030ec28db" Mar 20 11:02:50 crc kubenswrapper[4772]: E0320 11:02:50.105988 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccecad6519639b697b142db9d0f854677d7ce54e92a81915b8ef686030ec28db\": container with ID starting with ccecad6519639b697b142db9d0f854677d7ce54e92a81915b8ef686030ec28db not found: ID does not exist" containerID="ccecad6519639b697b142db9d0f854677d7ce54e92a81915b8ef686030ec28db" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.106014 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccecad6519639b697b142db9d0f854677d7ce54e92a81915b8ef686030ec28db"} err="failed to get container status \"ccecad6519639b697b142db9d0f854677d7ce54e92a81915b8ef686030ec28db\": rpc error: code = NotFound desc = could not find container \"ccecad6519639b697b142db9d0f854677d7ce54e92a81915b8ef686030ec28db\": container with ID starting with ccecad6519639b697b142db9d0f854677d7ce54e92a81915b8ef686030ec28db not found: ID does not exist" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.106031 4772 scope.go:117] "RemoveContainer" containerID="1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.118687 4772 scope.go:117] "RemoveContainer" containerID="c4e7959eb48ed220c995ade9c7ddd7984f1b7339769c2bfa03830bd543ae348b" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.132595 4772 scope.go:117] "RemoveContainer" containerID="8c739fcc8dbd9bac6a31fb014d95a929656927a4666b5af0a087bd256e3950b0" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.147086 4772 scope.go:117] "RemoveContainer" containerID="1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713" Mar 20 11:02:50 crc kubenswrapper[4772]: E0320 11:02:50.147600 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = 
could not find container \"1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713\": container with ID starting with 1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713 not found: ID does not exist" containerID="1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.147791 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713"} err="failed to get container status \"1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713\": rpc error: code = NotFound desc = could not find container \"1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713\": container with ID starting with 1cb6f1a2698cb5c39f9900358ad1b8483b5f7c74199a4697d83430520d748713 not found: ID does not exist" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.147958 4772 scope.go:117] "RemoveContainer" containerID="c4e7959eb48ed220c995ade9c7ddd7984f1b7339769c2bfa03830bd543ae348b" Mar 20 11:02:50 crc kubenswrapper[4772]: E0320 11:02:50.148419 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c4e7959eb48ed220c995ade9c7ddd7984f1b7339769c2bfa03830bd543ae348b\": container with ID starting with c4e7959eb48ed220c995ade9c7ddd7984f1b7339769c2bfa03830bd543ae348b not found: ID does not exist" containerID="c4e7959eb48ed220c995ade9c7ddd7984f1b7339769c2bfa03830bd543ae348b" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.148483 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c4e7959eb48ed220c995ade9c7ddd7984f1b7339769c2bfa03830bd543ae348b"} err="failed to get container status \"c4e7959eb48ed220c995ade9c7ddd7984f1b7339769c2bfa03830bd543ae348b\": rpc error: code = NotFound desc = could not find container \"c4e7959eb48ed220c995ade9c7ddd7984f1b7339769c2bfa03830bd543ae348b\": container with ID starting with c4e7959eb48ed220c995ade9c7ddd7984f1b7339769c2bfa03830bd543ae348b not found: ID does not exist" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.148532 4772 scope.go:117] "RemoveContainer" containerID="8c739fcc8dbd9bac6a31fb014d95a929656927a4666b5af0a087bd256e3950b0" Mar 20 11:02:50 crc kubenswrapper[4772]: E0320 11:02:50.148952 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c739fcc8dbd9bac6a31fb014d95a929656927a4666b5af0a087bd256e3950b0\": container with ID starting with 8c739fcc8dbd9bac6a31fb014d95a929656927a4666b5af0a087bd256e3950b0 not found: ID does not exist" containerID="8c739fcc8dbd9bac6a31fb014d95a929656927a4666b5af0a087bd256e3950b0" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.148982 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c739fcc8dbd9bac6a31fb014d95a929656927a4666b5af0a087bd256e3950b0"} err="failed to get container status \"8c739fcc8dbd9bac6a31fb014d95a929656927a4666b5af0a087bd256e3950b0\": rpc error: code = NotFound desc = could not find container \"8c739fcc8dbd9bac6a31fb014d95a929656927a4666b5af0a087bd256e3950b0\": container with ID starting with 8c739fcc8dbd9bac6a31fb014d95a929656927a4666b5af0a087bd256e3950b0 not found: ID does not exist" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.223971 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-6l8kp"] Mar 20 11:02:50 
crc kubenswrapper[4772]: I0320 11:02:50.231969 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-6l8kp"] Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.647111 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" path="/var/lib/kubelet/pods/22e23182-8e10-42d5-b34d-f09f6f280262/volumes" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.647909 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" path="/var/lib/kubelet/pods/3514d32d-88b3-47e4-b541-6ab2d46a6cfe/volumes" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.648575 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4eadb8ff-b747-4293-800f-b9894eb72ee3" path="/var/lib/kubelet/pods/4eadb8ff-b747-4293-800f-b9894eb72ee3/volumes" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.649632 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="77c87234-b79b-4d2f-8ee3-b14aa050925a" path="/var/lib/kubelet/pods/77c87234-b79b-4d2f-8ee3-b14aa050925a/volumes" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.650146 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" path="/var/lib/kubelet/pods/8db2c4ed-fcb2-48eb-a1a0-1be1d8613260/volumes" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.923377 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" event={"ID":"a15bf049-cf58-4f6d-942c-9bdaac82f6df","Type":"ContainerStarted","Data":"39932ea7d5fe1a27c3613ef61f522063e5bc04a0d0ee2438805baf5b2b8709c1"} Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.923898 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.927250 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" Mar 20 11:02:50 crc kubenswrapper[4772]: I0320 11:02:50.961262 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-gshhb" podStartSLOduration=1.96124079 podStartE2EDuration="1.96124079s" podCreationTimestamp="2026-03-20 11:02:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:02:50.936428973 +0000 UTC m=+457.027395458" watchObservedRunningTime="2026-03-20 11:02:50.96124079 +0000 UTC m=+457.052207285" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.412406 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-rpss5"] Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.412679 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" containerName="extract-content" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.412698 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" containerName="extract-content" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.412712 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c87234-b79b-4d2f-8ee3-b14aa050925a" containerName="marketplace-operator" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 
11:02:51.412724 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c87234-b79b-4d2f-8ee3-b14aa050925a" containerName="marketplace-operator" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.412741 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="77c87234-b79b-4d2f-8ee3-b14aa050925a" containerName="marketplace-operator" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.412752 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="77c87234-b79b-4d2f-8ee3-b14aa050925a" containerName="marketplace-operator" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.412767 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eadb8ff-b747-4293-800f-b9894eb72ee3" containerName="extract-utilities" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.412777 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eadb8ff-b747-4293-800f-b9894eb72ee3" containerName="extract-utilities" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.412789 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" containerName="extract-utilities" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.412799 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" containerName="extract-utilities" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.412819 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" containerName="extract-content" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.412829 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" containerName="extract-content" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.412867 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.412877 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.412902 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eadb8ff-b747-4293-800f-b9894eb72ee3" containerName="extract-content" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.412915 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eadb8ff-b747-4293-800f-b9894eb72ee3" containerName="extract-content" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.412929 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" containerName="extract-utilities" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.412940 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" containerName="extract-utilities" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.412955 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4eadb8ff-b747-4293-800f-b9894eb72ee3" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.412965 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="4eadb8ff-b747-4293-800f-b9894eb72ee3" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.412980 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" containerName="extract-utilities" 
Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.412991 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" containerName="extract-utilities" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.413005 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.413016 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.413049 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" containerName="extract-content" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.413060 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" containerName="extract-content" Mar 20 11:02:51 crc kubenswrapper[4772]: E0320 11:02:51.413075 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.413086 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.413236 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="77c87234-b79b-4d2f-8ee3-b14aa050925a" containerName="marketplace-operator" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.413263 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3514d32d-88b3-47e4-b541-6ab2d46a6cfe" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.413275 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="77c87234-b79b-4d2f-8ee3-b14aa050925a" containerName="marketplace-operator" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.413288 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="22e23182-8e10-42d5-b34d-f09f6f280262" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.413302 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="4eadb8ff-b747-4293-800f-b9894eb72ee3" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.413315 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db2c4ed-fcb2-48eb-a1a0-1be1d8613260" containerName="registry-server" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.414485 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.425289 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.427034 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpss5"] Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.576544 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a24fdf-708a-4252-9889-6d0c68ad8a5a-utilities\") pod \"redhat-marketplace-rpss5\" (UID: \"b4a24fdf-708a-4252-9889-6d0c68ad8a5a\") " pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.576922 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a24fdf-708a-4252-9889-6d0c68ad8a5a-catalog-content\") pod \"redhat-marketplace-rpss5\" (UID: \"b4a24fdf-708a-4252-9889-6d0c68ad8a5a\") " pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.577087 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4nz4p\" (UniqueName: \"kubernetes.io/projected/b4a24fdf-708a-4252-9889-6d0c68ad8a5a-kube-api-access-4nz4p\") pod \"redhat-marketplace-rpss5\" (UID: \"b4a24fdf-708a-4252-9889-6d0c68ad8a5a\") " pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.610981 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-zrgrl"] Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.611880 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.613850 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.622614 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrgrl"] Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.678896 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4nz4p\" (UniqueName: \"kubernetes.io/projected/b4a24fdf-708a-4252-9889-6d0c68ad8a5a-kube-api-access-4nz4p\") pod \"redhat-marketplace-rpss5\" (UID: \"b4a24fdf-708a-4252-9889-6d0c68ad8a5a\") " pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.678983 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m7r76\" (UniqueName: \"kubernetes.io/projected/ec7931ba-a74f-4de4-9936-4eb143aaadf7-kube-api-access-m7r76\") pod \"certified-operators-zrgrl\" (UID: \"ec7931ba-a74f-4de4-9936-4eb143aaadf7\") " pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.679013 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7931ba-a74f-4de4-9936-4eb143aaadf7-utilities\") pod \"certified-operators-zrgrl\" (UID: \"ec7931ba-a74f-4de4-9936-4eb143aaadf7\") " pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.679040 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7931ba-a74f-4de4-9936-4eb143aaadf7-catalog-content\") pod \"certified-operators-zrgrl\" (UID: \"ec7931ba-a74f-4de4-9936-4eb143aaadf7\") " pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.679069 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a24fdf-708a-4252-9889-6d0c68ad8a5a-utilities\") pod \"redhat-marketplace-rpss5\" (UID: \"b4a24fdf-708a-4252-9889-6d0c68ad8a5a\") " pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.679105 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a24fdf-708a-4252-9889-6d0c68ad8a5a-catalog-content\") pod \"redhat-marketplace-rpss5\" (UID: \"b4a24fdf-708a-4252-9889-6d0c68ad8a5a\") " pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.679599 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b4a24fdf-708a-4252-9889-6d0c68ad8a5a-catalog-content\") pod \"redhat-marketplace-rpss5\" (UID: \"b4a24fdf-708a-4252-9889-6d0c68ad8a5a\") " pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.679925 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b4a24fdf-708a-4252-9889-6d0c68ad8a5a-utilities\") pod \"redhat-marketplace-rpss5\" (UID: 
\"b4a24fdf-708a-4252-9889-6d0c68ad8a5a\") " pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.697857 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4nz4p\" (UniqueName: \"kubernetes.io/projected/b4a24fdf-708a-4252-9889-6d0c68ad8a5a-kube-api-access-4nz4p\") pod \"redhat-marketplace-rpss5\" (UID: \"b4a24fdf-708a-4252-9889-6d0c68ad8a5a\") " pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.733384 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.780611 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m7r76\" (UniqueName: \"kubernetes.io/projected/ec7931ba-a74f-4de4-9936-4eb143aaadf7-kube-api-access-m7r76\") pod \"certified-operators-zrgrl\" (UID: \"ec7931ba-a74f-4de4-9936-4eb143aaadf7\") " pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.780662 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7931ba-a74f-4de4-9936-4eb143aaadf7-utilities\") pod \"certified-operators-zrgrl\" (UID: \"ec7931ba-a74f-4de4-9936-4eb143aaadf7\") " pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.780688 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7931ba-a74f-4de4-9936-4eb143aaadf7-catalog-content\") pod \"certified-operators-zrgrl\" (UID: \"ec7931ba-a74f-4de4-9936-4eb143aaadf7\") " pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.781769 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ec7931ba-a74f-4de4-9936-4eb143aaadf7-utilities\") pod \"certified-operators-zrgrl\" (UID: \"ec7931ba-a74f-4de4-9936-4eb143aaadf7\") " pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.783022 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ec7931ba-a74f-4de4-9936-4eb143aaadf7-catalog-content\") pod \"certified-operators-zrgrl\" (UID: \"ec7931ba-a74f-4de4-9936-4eb143aaadf7\") " pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.799985 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-m7r76\" (UniqueName: \"kubernetes.io/projected/ec7931ba-a74f-4de4-9936-4eb143aaadf7-kube-api-access-m7r76\") pod \"certified-operators-zrgrl\" (UID: \"ec7931ba-a74f-4de4-9936-4eb143aaadf7\") " pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.938030 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-rpss5"] Mar 20 11:02:51 crc kubenswrapper[4772]: I0320 11:02:51.947634 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:02:51 crc kubenswrapper[4772]: W0320 11:02:51.948552 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb4a24fdf_708a_4252_9889_6d0c68ad8a5a.slice/crio-2791866bc674d922c61130ad64da1214eae3b70a7180345ca389d2f1c70ded88 WatchSource:0}: Error finding container 2791866bc674d922c61130ad64da1214eae3b70a7180345ca389d2f1c70ded88: Status 404 returned error can't find the container with id 2791866bc674d922c61130ad64da1214eae3b70a7180345ca389d2f1c70ded88 Mar 20 11:02:52 crc kubenswrapper[4772]: I0320 11:02:52.123659 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-zrgrl"] Mar 20 11:02:52 crc kubenswrapper[4772]: W0320 11:02:52.127588 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podec7931ba_a74f_4de4_9936_4eb143aaadf7.slice/crio-9be424413774ed58b69350e2713f0d58d520208089d8af2b622221876bf6b6d5 WatchSource:0}: Error finding container 9be424413774ed58b69350e2713f0d58d520208089d8af2b622221876bf6b6d5: Status 404 returned error can't find the container with id 9be424413774ed58b69350e2713f0d58d520208089d8af2b622221876bf6b6d5 Mar 20 11:02:52 crc kubenswrapper[4772]: I0320 11:02:52.938861 4772 generic.go:334] "Generic (PLEG): container finished" podID="ec7931ba-a74f-4de4-9936-4eb143aaadf7" containerID="4ca03a0dc19151f7a62afe39d8ebec2f0e0f5a355bb633e24ddacbca676e62b6" exitCode=0 Mar 20 11:02:52 crc kubenswrapper[4772]: I0320 11:02:52.938936 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrgrl" event={"ID":"ec7931ba-a74f-4de4-9936-4eb143aaadf7","Type":"ContainerDied","Data":"4ca03a0dc19151f7a62afe39d8ebec2f0e0f5a355bb633e24ddacbca676e62b6"} Mar 20 11:02:52 crc kubenswrapper[4772]: I0320 11:02:52.938967 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrgrl" event={"ID":"ec7931ba-a74f-4de4-9936-4eb143aaadf7","Type":"ContainerStarted","Data":"9be424413774ed58b69350e2713f0d58d520208089d8af2b622221876bf6b6d5"} Mar 20 11:02:52 crc kubenswrapper[4772]: I0320 11:02:52.940525 4772 generic.go:334] "Generic (PLEG): container finished" podID="b4a24fdf-708a-4252-9889-6d0c68ad8a5a" containerID="fabea08678734f9dd8f56050463b13fd2b91a765d7cc84bdb776ab85e882993b" exitCode=0 Mar 20 11:02:52 crc kubenswrapper[4772]: I0320 11:02:52.940579 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpss5" event={"ID":"b4a24fdf-708a-4252-9889-6d0c68ad8a5a","Type":"ContainerDied","Data":"fabea08678734f9dd8f56050463b13fd2b91a765d7cc84bdb776ab85e882993b"} Mar 20 11:02:52 crc kubenswrapper[4772]: I0320 11:02:52.940659 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpss5" event={"ID":"b4a24fdf-708a-4252-9889-6d0c68ad8a5a","Type":"ContainerStarted","Data":"2791866bc674d922c61130ad64da1214eae3b70a7180345ca389d2f1c70ded88"} Mar 20 11:02:53 crc kubenswrapper[4772]: I0320 11:02:53.813649 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rv948"] Mar 20 11:02:53 crc kubenswrapper[4772]: I0320 11:02:53.815068 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:02:53 crc kubenswrapper[4772]: I0320 11:02:53.817364 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Mar 20 11:02:53 crc kubenswrapper[4772]: I0320 11:02:53.826954 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rv948"] Mar 20 11:02:53 crc kubenswrapper[4772]: I0320 11:02:53.924598 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fllxt\" (UniqueName: \"kubernetes.io/projected/8003a3e3-923e-4962-a56c-7499ddb205ba-kube-api-access-fllxt\") pod \"redhat-operators-rv948\" (UID: \"8003a3e3-923e-4962-a56c-7499ddb205ba\") " pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:02:53 crc kubenswrapper[4772]: I0320 11:02:53.924680 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8003a3e3-923e-4962-a56c-7499ddb205ba-utilities\") pod \"redhat-operators-rv948\" (UID: \"8003a3e3-923e-4962-a56c-7499ddb205ba\") " pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:02:53 crc kubenswrapper[4772]: I0320 11:02:53.924791 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8003a3e3-923e-4962-a56c-7499ddb205ba-catalog-content\") pod \"redhat-operators-rv948\" (UID: \"8003a3e3-923e-4962-a56c-7499ddb205ba\") " pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:02:53 crc kubenswrapper[4772]: I0320 11:02:53.947970 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrgrl" event={"ID":"ec7931ba-a74f-4de4-9936-4eb143aaadf7","Type":"ContainerStarted","Data":"69cab2cc0e6c082e9f1323c4fa4573bf68edce2a49788c002cc4fa635308388c"} Mar 20 11:02:53 crc kubenswrapper[4772]: I0320 11:02:53.950215 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpss5" event={"ID":"b4a24fdf-708a-4252-9889-6d0c68ad8a5a","Type":"ContainerStarted","Data":"08510baf4aa8025381aa02a1841915c9aba91349172dad7854760081c4a6edc6"} Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.008668 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-l6262"] Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.010052 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l6262" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.013417 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.021213 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l6262"] Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.026071 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8003a3e3-923e-4962-a56c-7499ddb205ba-catalog-content\") pod \"redhat-operators-rv948\" (UID: \"8003a3e3-923e-4962-a56c-7499ddb205ba\") " pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.026158 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fllxt\" (UniqueName: \"kubernetes.io/projected/8003a3e3-923e-4962-a56c-7499ddb205ba-kube-api-access-fllxt\") pod \"redhat-operators-rv948\" (UID: \"8003a3e3-923e-4962-a56c-7499ddb205ba\") " pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.026191 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8003a3e3-923e-4962-a56c-7499ddb205ba-utilities\") pod \"redhat-operators-rv948\" (UID: \"8003a3e3-923e-4962-a56c-7499ddb205ba\") " pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.026661 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/8003a3e3-923e-4962-a56c-7499ddb205ba-utilities\") pod \"redhat-operators-rv948\" (UID: \"8003a3e3-923e-4962-a56c-7499ddb205ba\") " pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.026714 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/8003a3e3-923e-4962-a56c-7499ddb205ba-catalog-content\") pod \"redhat-operators-rv948\" (UID: \"8003a3e3-923e-4962-a56c-7499ddb205ba\") " pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.053652 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fllxt\" (UniqueName: \"kubernetes.io/projected/8003a3e3-923e-4962-a56c-7499ddb205ba-kube-api-access-fllxt\") pod \"redhat-operators-rv948\" (UID: \"8003a3e3-923e-4962-a56c-7499ddb205ba\") " pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.128401 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-catalog-content\") pod \"community-operators-l6262\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " pod="openshift-marketplace/community-operators-l6262" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.128533 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-utilities\") pod \"community-operators-l6262\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " 
pod="openshift-marketplace/community-operators-l6262" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.128573 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd42h\" (UniqueName: \"kubernetes.io/projected/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-kube-api-access-sd42h\") pod \"community-operators-l6262\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " pod="openshift-marketplace/community-operators-l6262" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.133197 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.230074 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-catalog-content\") pod \"community-operators-l6262\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " pod="openshift-marketplace/community-operators-l6262" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.230173 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-utilities\") pod \"community-operators-l6262\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " pod="openshift-marketplace/community-operators-l6262" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.230233 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sd42h\" (UniqueName: \"kubernetes.io/projected/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-kube-api-access-sd42h\") pod \"community-operators-l6262\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " pod="openshift-marketplace/community-operators-l6262" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.233431 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-catalog-content\") pod \"community-operators-l6262\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " pod="openshift-marketplace/community-operators-l6262" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.233994 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-utilities\") pod \"community-operators-l6262\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " pod="openshift-marketplace/community-operators-l6262" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.251225 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd42h\" (UniqueName: \"kubernetes.io/projected/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-kube-api-access-sd42h\") pod \"community-operators-l6262\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " pod="openshift-marketplace/community-operators-l6262" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.325634 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-l6262" Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.957337 4772 generic.go:334] "Generic (PLEG): container finished" podID="ec7931ba-a74f-4de4-9936-4eb143aaadf7" containerID="69cab2cc0e6c082e9f1323c4fa4573bf68edce2a49788c002cc4fa635308388c" exitCode=0 Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.957403 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrgrl" event={"ID":"ec7931ba-a74f-4de4-9936-4eb143aaadf7","Type":"ContainerDied","Data":"69cab2cc0e6c082e9f1323c4fa4573bf68edce2a49788c002cc4fa635308388c"} Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.962606 4772 generic.go:334] "Generic (PLEG): container finished" podID="b4a24fdf-708a-4252-9889-6d0c68ad8a5a" containerID="08510baf4aa8025381aa02a1841915c9aba91349172dad7854760081c4a6edc6" exitCode=0 Mar 20 11:02:54 crc kubenswrapper[4772]: I0320 11:02:54.962651 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpss5" event={"ID":"b4a24fdf-708a-4252-9889-6d0c68ad8a5a","Type":"ContainerDied","Data":"08510baf4aa8025381aa02a1841915c9aba91349172dad7854760081c4a6edc6"} Mar 20 11:02:55 crc kubenswrapper[4772]: I0320 11:02:55.145938 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-l6262"] Mar 20 11:02:55 crc kubenswrapper[4772]: I0320 11:02:55.147163 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rv948"] Mar 20 11:02:55 crc kubenswrapper[4772]: I0320 11:02:55.973483 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" containerID="b9fe89e9820c683dfbb1c6e302bcb9f39e16a8da99a739d036a5a9b88ac5b367" exitCode=0 Mar 20 11:02:55 crc kubenswrapper[4772]: I0320 11:02:55.973565 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6262" event={"ID":"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510","Type":"ContainerDied","Data":"b9fe89e9820c683dfbb1c6e302bcb9f39e16a8da99a739d036a5a9b88ac5b367"} Mar 20 11:02:55 crc kubenswrapper[4772]: I0320 11:02:55.974050 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6262" event={"ID":"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510","Type":"ContainerStarted","Data":"f764be5139eada7d786c76f5a5741ffe9d46bb32c5cdca5cb209975d5ff9e087"} Mar 20 11:02:55 crc kubenswrapper[4772]: I0320 11:02:55.975554 4772 generic.go:334] "Generic (PLEG): container finished" podID="8003a3e3-923e-4962-a56c-7499ddb205ba" containerID="e4488d3be4dc0a36bea1c8059624589f1e9578fc14a3967b70f4d5cab2eaa49c" exitCode=0 Mar 20 11:02:55 crc kubenswrapper[4772]: I0320 11:02:55.975799 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rv948" event={"ID":"8003a3e3-923e-4962-a56c-7499ddb205ba","Type":"ContainerDied","Data":"e4488d3be4dc0a36bea1c8059624589f1e9578fc14a3967b70f4d5cab2eaa49c"} Mar 20 11:02:55 crc kubenswrapper[4772]: I0320 11:02:55.975864 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rv948" event={"ID":"8003a3e3-923e-4962-a56c-7499ddb205ba","Type":"ContainerStarted","Data":"43ecafbf6b30ffb0bb1b17dc2814089703c23cbacac379f4359189125202488a"} Mar 20 11:02:55 crc kubenswrapper[4772]: I0320 11:02:55.978299 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-rpss5" 
event={"ID":"b4a24fdf-708a-4252-9889-6d0c68ad8a5a","Type":"ContainerStarted","Data":"83d69e5bcd92df2bf534fb9357b45b5c512ecb3149ed24dccdc5e4aaaa357603"} Mar 20 11:02:56 crc kubenswrapper[4772]: I0320 11:02:56.033664 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-rpss5" podStartSLOduration=2.164401723 podStartE2EDuration="5.033644328s" podCreationTimestamp="2026-03-20 11:02:51 +0000 UTC" firstStartedPulling="2026-03-20 11:02:52.942058189 +0000 UTC m=+459.033024674" lastFinishedPulling="2026-03-20 11:02:55.811300794 +0000 UTC m=+461.902267279" observedRunningTime="2026-03-20 11:02:56.031007874 +0000 UTC m=+462.121974359" watchObservedRunningTime="2026-03-20 11:02:56.033644328 +0000 UTC m=+462.124610813" Mar 20 11:02:56 crc kubenswrapper[4772]: I0320 11:02:56.991237 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-zrgrl" event={"ID":"ec7931ba-a74f-4de4-9936-4eb143aaadf7","Type":"ContainerStarted","Data":"4acabda6e1cbe9757d5b545547af2fc4c071c5e69d037c306b1bfd2cc444dc53"} Mar 20 11:02:56 crc kubenswrapper[4772]: I0320 11:02:56.993987 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6262" event={"ID":"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510","Type":"ContainerStarted","Data":"e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26"} Mar 20 11:02:57 crc kubenswrapper[4772]: I0320 11:02:57.018300 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-zrgrl" podStartSLOduration=3.002636409 podStartE2EDuration="6.018278905s" podCreationTimestamp="2026-03-20 11:02:51 +0000 UTC" firstStartedPulling="2026-03-20 11:02:52.940392172 +0000 UTC m=+459.031358657" lastFinishedPulling="2026-03-20 11:02:55.956034668 +0000 UTC m=+462.047001153" observedRunningTime="2026-03-20 11:02:57.014739826 +0000 UTC m=+463.105706321" watchObservedRunningTime="2026-03-20 11:02:57.018278905 +0000 UTC m=+463.109245390" Mar 20 11:02:58 crc kubenswrapper[4772]: I0320 11:02:58.002345 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" containerID="e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26" exitCode=0 Mar 20 11:02:58 crc kubenswrapper[4772]: I0320 11:02:58.002408 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6262" event={"ID":"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510","Type":"ContainerDied","Data":"e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26"} Mar 20 11:02:58 crc kubenswrapper[4772]: I0320 11:02:58.022674 4772 generic.go:334] "Generic (PLEG): container finished" podID="8003a3e3-923e-4962-a56c-7499ddb205ba" containerID="e94f69210fbcab679e0bf46f16f7162da4eb31c67594012023d76c62d93b7f55" exitCode=0 Mar 20 11:02:58 crc kubenswrapper[4772]: I0320 11:02:58.023607 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rv948" event={"ID":"8003a3e3-923e-4962-a56c-7499ddb205ba","Type":"ContainerDied","Data":"e94f69210fbcab679e0bf46f16f7162da4eb31c67594012023d76c62d93b7f55"} Mar 20 11:02:59 crc kubenswrapper[4772]: I0320 11:02:59.029020 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6262" event={"ID":"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510","Type":"ContainerStarted","Data":"d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5"} Mar 20 11:02:59 
crc kubenswrapper[4772]: I0320 11:02:59.032077 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rv948" event={"ID":"8003a3e3-923e-4962-a56c-7499ddb205ba","Type":"ContainerStarted","Data":"3076d442ccaad715c2f5893f7c7649b172ec332a781ba619b4a812266d8b4b36"} Mar 20 11:02:59 crc kubenswrapper[4772]: I0320 11:02:59.051898 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-l6262" podStartSLOduration=3.593031995 podStartE2EDuration="6.051879866s" podCreationTimestamp="2026-03-20 11:02:53 +0000 UTC" firstStartedPulling="2026-03-20 11:02:55.975462764 +0000 UTC m=+462.066429259" lastFinishedPulling="2026-03-20 11:02:58.434310645 +0000 UTC m=+464.525277130" observedRunningTime="2026-03-20 11:02:59.05062058 +0000 UTC m=+465.141587075" watchObservedRunningTime="2026-03-20 11:02:59.051879866 +0000 UTC m=+465.142846371" Mar 20 11:02:59 crc kubenswrapper[4772]: I0320 11:02:59.066729 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rv948" podStartSLOduration=3.498794507 podStartE2EDuration="6.066711992s" podCreationTimestamp="2026-03-20 11:02:53 +0000 UTC" firstStartedPulling="2026-03-20 11:02:55.976977746 +0000 UTC m=+462.067944251" lastFinishedPulling="2026-03-20 11:02:58.544895241 +0000 UTC m=+464.635861736" observedRunningTime="2026-03-20 11:02:59.064753488 +0000 UTC m=+465.155719973" watchObservedRunningTime="2026-03-20 11:02:59.066711992 +0000 UTC m=+465.157678467" Mar 20 11:03:01 crc kubenswrapper[4772]: I0320 11:03:01.266507 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" podUID="15ca766d-44d0-4433-b2f8-6348e66ee047" containerName="registry" containerID="cri-o://3b3b86a309e5e6ddc2d6ddf779075cb856ec9bdaef8470f2db27bbeaf080fcc3" gracePeriod=30 Mar 20 11:03:01 crc kubenswrapper[4772]: I0320 11:03:01.733613 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:03:01 crc kubenswrapper[4772]: I0320 11:03:01.733688 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:03:01 crc kubenswrapper[4772]: I0320 11:03:01.772737 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:03:01 crc kubenswrapper[4772]: I0320 11:03:01.948167 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:03:01 crc kubenswrapper[4772]: I0320 11:03:01.948457 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:03:01 crc kubenswrapper[4772]: I0320 11:03:01.988443 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.050278 4772 generic.go:334] "Generic (PLEG): container finished" podID="15ca766d-44d0-4433-b2f8-6348e66ee047" containerID="3b3b86a309e5e6ddc2d6ddf779075cb856ec9bdaef8470f2db27bbeaf080fcc3" exitCode=0 Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.050390 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" 
event={"ID":"15ca766d-44d0-4433-b2f8-6348e66ee047","Type":"ContainerDied","Data":"3b3b86a309e5e6ddc2d6ddf779075cb856ec9bdaef8470f2db27bbeaf080fcc3"} Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.087317 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-zrgrl" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.090413 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-rpss5" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.118185 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.236282 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-certificates\") pod \"15ca766d-44d0-4433-b2f8-6348e66ee047\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.236331 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-trusted-ca\") pod \"15ca766d-44d0-4433-b2f8-6348e66ee047\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.236358 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-tls\") pod \"15ca766d-44d0-4433-b2f8-6348e66ee047\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.236382 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15ca766d-44d0-4433-b2f8-6348e66ee047-installation-pull-secrets\") pod \"15ca766d-44d0-4433-b2f8-6348e66ee047\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.236409 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-bound-sa-token\") pod \"15ca766d-44d0-4433-b2f8-6348e66ee047\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.236465 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15ca766d-44d0-4433-b2f8-6348e66ee047-ca-trust-extracted\") pod \"15ca766d-44d0-4433-b2f8-6348e66ee047\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.236506 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jtfzm\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-kube-api-access-jtfzm\") pod \"15ca766d-44d0-4433-b2f8-6348e66ee047\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.236599 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"15ca766d-44d0-4433-b2f8-6348e66ee047\" (UID: \"15ca766d-44d0-4433-b2f8-6348e66ee047\") " Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.237154 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "15ca766d-44d0-4433-b2f8-6348e66ee047" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.237201 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "15ca766d-44d0-4433-b2f8-6348e66ee047" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.242033 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "15ca766d-44d0-4433-b2f8-6348e66ee047" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.242613 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "15ca766d-44d0-4433-b2f8-6348e66ee047" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.248016 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-kube-api-access-jtfzm" (OuterVolumeSpecName: "kube-api-access-jtfzm") pod "15ca766d-44d0-4433-b2f8-6348e66ee047" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047"). InnerVolumeSpecName "kube-api-access-jtfzm". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.248025 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15ca766d-44d0-4433-b2f8-6348e66ee047-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "15ca766d-44d0-4433-b2f8-6348e66ee047" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.253059 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "15ca766d-44d0-4433-b2f8-6348e66ee047" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.254851 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15ca766d-44d0-4433-b2f8-6348e66ee047-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "15ca766d-44d0-4433-b2f8-6348e66ee047" (UID: "15ca766d-44d0-4433-b2f8-6348e66ee047"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.338089 4772 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/15ca766d-44d0-4433-b2f8-6348e66ee047-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.338126 4772 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-bound-sa-token\") on node \"crc\" DevicePath \"\"" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.338137 4772 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/15ca766d-44d0-4433-b2f8-6348e66ee047-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.338145 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jtfzm\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-kube-api-access-jtfzm\") on node \"crc\" DevicePath \"\"" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.338153 4772 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-certificates\") on node \"crc\" DevicePath \"\"" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.338162 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/15ca766d-44d0-4433-b2f8-6348e66ee047-trusted-ca\") on node \"crc\" DevicePath \"\"" Mar 20 11:03:02 crc kubenswrapper[4772]: I0320 11:03:02.338170 4772 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/15ca766d-44d0-4433-b2f8-6348e66ee047-registry-tls\") on node \"crc\" DevicePath \"\"" Mar 20 11:03:03 crc kubenswrapper[4772]: I0320 11:03:03.059418 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" Mar 20 11:03:03 crc kubenswrapper[4772]: I0320 11:03:03.059414 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-xd664" event={"ID":"15ca766d-44d0-4433-b2f8-6348e66ee047","Type":"ContainerDied","Data":"bd3ade33c5717b92a02e0632ca0d83cfb3c1fc7498d5c0fa39f76c710696bfb0"} Mar 20 11:03:03 crc kubenswrapper[4772]: I0320 11:03:03.059630 4772 scope.go:117] "RemoveContainer" containerID="3b3b86a309e5e6ddc2d6ddf779075cb856ec9bdaef8470f2db27bbeaf080fcc3" Mar 20 11:03:03 crc kubenswrapper[4772]: I0320 11:03:03.090522 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xd664"] Mar 20 11:03:03 crc kubenswrapper[4772]: I0320 11:03:03.096224 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-xd664"] Mar 20 11:03:04 crc kubenswrapper[4772]: I0320 11:03:04.135283 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:03:04 crc kubenswrapper[4772]: I0320 11:03:04.138881 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:03:04 crc kubenswrapper[4772]: I0320 11:03:04.326745 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-l6262" Mar 20 11:03:04 crc kubenswrapper[4772]: I0320 11:03:04.326825 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-l6262" Mar 20 11:03:04 crc kubenswrapper[4772]: I0320 11:03:04.391559 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-l6262" Mar 20 11:03:04 crc kubenswrapper[4772]: I0320 11:03:04.648902 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15ca766d-44d0-4433-b2f8-6348e66ee047" path="/var/lib/kubelet/pods/15ca766d-44d0-4433-b2f8-6348e66ee047/volumes" Mar 20 11:03:05 crc kubenswrapper[4772]: I0320 11:03:05.108315 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-l6262" Mar 20 11:03:05 crc kubenswrapper[4772]: I0320 11:03:05.173659 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-rv948" podUID="8003a3e3-923e-4962-a56c-7499ddb205ba" containerName="registry-server" probeResult="failure" output=< Mar 20 11:03:05 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Mar 20 11:03:05 crc kubenswrapper[4772]: > Mar 20 11:03:09 crc kubenswrapper[4772]: I0320 11:03:09.564367 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:03:09 crc kubenswrapper[4772]: I0320 11:03:09.564804 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:03:09 crc kubenswrapper[4772]: I0320 
11:03:09.564888 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 11:03:09 crc kubenswrapper[4772]: I0320 11:03:09.565673 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c7d6b41b9d4ea0c87e4e55d0473f2ee78f694c9c978233327bfd7e8f2cecafdc"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:03:09 crc kubenswrapper[4772]: I0320 11:03:09.565737 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://c7d6b41b9d4ea0c87e4e55d0473f2ee78f694c9c978233327bfd7e8f2cecafdc" gracePeriod=600 Mar 20 11:03:10 crc kubenswrapper[4772]: I0320 11:03:10.098507 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="c7d6b41b9d4ea0c87e4e55d0473f2ee78f694c9c978233327bfd7e8f2cecafdc" exitCode=0 Mar 20 11:03:10 crc kubenswrapper[4772]: I0320 11:03:10.098580 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"c7d6b41b9d4ea0c87e4e55d0473f2ee78f694c9c978233327bfd7e8f2cecafdc"} Mar 20 11:03:10 crc kubenswrapper[4772]: I0320 11:03:10.098650 4772 scope.go:117] "RemoveContainer" containerID="b8877dda324b6beb71c4399356ed4adbc166737ed2a4e5779392a95ee6bd091e" Mar 20 11:03:11 crc kubenswrapper[4772]: I0320 11:03:11.107154 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"2a850ae2fa64972c823328c8fe9588b84861e2acf1ca840a7222809a1aa0c1ff"} Mar 20 11:03:14 crc kubenswrapper[4772]: I0320 11:03:14.189088 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:03:14 crc kubenswrapper[4772]: I0320 11:03:14.233271 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rv948" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.131698 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566744-pn8jx"] Mar 20 11:04:00 crc kubenswrapper[4772]: E0320 11:04:00.133771 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="15ca766d-44d0-4433-b2f8-6348e66ee047" containerName="registry" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.133921 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="15ca766d-44d0-4433-b2f8-6348e66ee047" containerName="registry" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.134161 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="15ca766d-44d0-4433-b2f8-6348e66ee047" containerName="registry" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.134710 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-pn8jx" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.136800 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.136917 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-pn8jx"] Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.137075 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.138670 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.235112 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z47rg\" (UniqueName: \"kubernetes.io/projected/a1b29235-5568-47d2-b736-a9b91dfbab0b-kube-api-access-z47rg\") pod \"auto-csr-approver-29566744-pn8jx\" (UID: \"a1b29235-5568-47d2-b736-a9b91dfbab0b\") " pod="openshift-infra/auto-csr-approver-29566744-pn8jx" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.336216 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z47rg\" (UniqueName: \"kubernetes.io/projected/a1b29235-5568-47d2-b736-a9b91dfbab0b-kube-api-access-z47rg\") pod \"auto-csr-approver-29566744-pn8jx\" (UID: \"a1b29235-5568-47d2-b736-a9b91dfbab0b\") " pod="openshift-infra/auto-csr-approver-29566744-pn8jx" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.356746 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z47rg\" (UniqueName: \"kubernetes.io/projected/a1b29235-5568-47d2-b736-a9b91dfbab0b-kube-api-access-z47rg\") pod \"auto-csr-approver-29566744-pn8jx\" (UID: \"a1b29235-5568-47d2-b736-a9b91dfbab0b\") " pod="openshift-infra/auto-csr-approver-29566744-pn8jx" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.452614 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-pn8jx" Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.622324 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-pn8jx"] Mar 20 11:04:00 crc kubenswrapper[4772]: I0320 11:04:00.629860 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:04:01 crc kubenswrapper[4772]: I0320 11:04:01.404759 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-pn8jx" event={"ID":"a1b29235-5568-47d2-b736-a9b91dfbab0b","Type":"ContainerStarted","Data":"c79e074c45ea7364f3d209dcbd491b9142c9223ff90bb1ee36b85af8d94e1471"} Mar 20 11:04:02 crc kubenswrapper[4772]: I0320 11:04:02.410956 4772 generic.go:334] "Generic (PLEG): container finished" podID="a1b29235-5568-47d2-b736-a9b91dfbab0b" containerID="06f0d4f436a0ff2ad03b0761fb8607acf4074b20753529a92f495c258e1e122a" exitCode=0 Mar 20 11:04:02 crc kubenswrapper[4772]: I0320 11:04:02.410997 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-pn8jx" event={"ID":"a1b29235-5568-47d2-b736-a9b91dfbab0b","Type":"ContainerDied","Data":"06f0d4f436a0ff2ad03b0761fb8607acf4074b20753529a92f495c258e1e122a"} Mar 20 11:04:03 crc kubenswrapper[4772]: I0320 11:04:03.654139 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-pn8jx" Mar 20 11:04:03 crc kubenswrapper[4772]: I0320 11:04:03.778958 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z47rg\" (UniqueName: \"kubernetes.io/projected/a1b29235-5568-47d2-b736-a9b91dfbab0b-kube-api-access-z47rg\") pod \"a1b29235-5568-47d2-b736-a9b91dfbab0b\" (UID: \"a1b29235-5568-47d2-b736-a9b91dfbab0b\") " Mar 20 11:04:03 crc kubenswrapper[4772]: I0320 11:04:03.783980 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a1b29235-5568-47d2-b736-a9b91dfbab0b-kube-api-access-z47rg" (OuterVolumeSpecName: "kube-api-access-z47rg") pod "a1b29235-5568-47d2-b736-a9b91dfbab0b" (UID: "a1b29235-5568-47d2-b736-a9b91dfbab0b"). InnerVolumeSpecName "kube-api-access-z47rg". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:04:03 crc kubenswrapper[4772]: I0320 11:04:03.880131 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z47rg\" (UniqueName: \"kubernetes.io/projected/a1b29235-5568-47d2-b736-a9b91dfbab0b-kube-api-access-z47rg\") on node \"crc\" DevicePath \"\"" Mar 20 11:04:04 crc kubenswrapper[4772]: I0320 11:04:04.421415 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566744-pn8jx" event={"ID":"a1b29235-5568-47d2-b736-a9b91dfbab0b","Type":"ContainerDied","Data":"c79e074c45ea7364f3d209dcbd491b9142c9223ff90bb1ee36b85af8d94e1471"} Mar 20 11:04:04 crc kubenswrapper[4772]: I0320 11:04:04.421455 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c79e074c45ea7364f3d209dcbd491b9142c9223ff90bb1ee36b85af8d94e1471" Mar 20 11:04:04 crc kubenswrapper[4772]: I0320 11:04:04.421476 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566744-pn8jx" Mar 20 11:04:04 crc kubenswrapper[4772]: I0320 11:04:04.710822 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-zlqcq"] Mar 20 11:04:04 crc kubenswrapper[4772]: I0320 11:04:04.713915 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566738-zlqcq"] Mar 20 11:04:06 crc kubenswrapper[4772]: I0320 11:04:06.648240 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9309f110-5a80-46ca-b3de-8087048c13e2" path="/var/lib/kubelet/pods/9309f110-5a80-46ca-b3de-8087048c13e2/volumes" Mar 20 11:05:39 crc kubenswrapper[4772]: I0320 11:05:39.564976 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:05:39 crc kubenswrapper[4772]: I0320 11:05:39.565680 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.132948 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566746-znz9n"] Mar 20 11:06:00 crc kubenswrapper[4772]: E0320 11:06:00.133659 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a1b29235-5568-47d2-b736-a9b91dfbab0b" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.133675 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a1b29235-5568-47d2-b736-a9b91dfbab0b" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.133787 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a1b29235-5568-47d2-b736-a9b91dfbab0b" containerName="oc" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.134216 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-znz9n" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.137113 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.137217 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.137240 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.159017 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-znz9n"] Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.279202 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8hj4t\" (UniqueName: \"kubernetes.io/projected/5a30b4d8-69d6-4376-a53a-9011a7e681d9-kube-api-access-8hj4t\") pod \"auto-csr-approver-29566746-znz9n\" (UID: \"5a30b4d8-69d6-4376-a53a-9011a7e681d9\") " pod="openshift-infra/auto-csr-approver-29566746-znz9n" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.403816 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8hj4t\" (UniqueName: \"kubernetes.io/projected/5a30b4d8-69d6-4376-a53a-9011a7e681d9-kube-api-access-8hj4t\") pod \"auto-csr-approver-29566746-znz9n\" (UID: \"5a30b4d8-69d6-4376-a53a-9011a7e681d9\") " pod="openshift-infra/auto-csr-approver-29566746-znz9n" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.434630 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8hj4t\" (UniqueName: \"kubernetes.io/projected/5a30b4d8-69d6-4376-a53a-9011a7e681d9-kube-api-access-8hj4t\") pod \"auto-csr-approver-29566746-znz9n\" (UID: \"5a30b4d8-69d6-4376-a53a-9011a7e681d9\") " pod="openshift-infra/auto-csr-approver-29566746-znz9n" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.457434 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-znz9n" Mar 20 11:06:00 crc kubenswrapper[4772]: I0320 11:06:00.653463 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-znz9n"] Mar 20 11:06:01 crc kubenswrapper[4772]: I0320 11:06:01.085425 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-znz9n" event={"ID":"5a30b4d8-69d6-4376-a53a-9011a7e681d9","Type":"ContainerStarted","Data":"b8500d67d6e44fd56dc39ab207676c03e5eb3135f95093e978b228d324322fe7"} Mar 20 11:06:03 crc kubenswrapper[4772]: I0320 11:06:03.096393 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-znz9n" event={"ID":"5a30b4d8-69d6-4376-a53a-9011a7e681d9","Type":"ContainerStarted","Data":"42beb27ed312c971ba6b1e765200b7fff652794381276c3378000aa392f8c5f9"} Mar 20 11:06:04 crc kubenswrapper[4772]: I0320 11:06:04.102163 4772 generic.go:334] "Generic (PLEG): container finished" podID="5a30b4d8-69d6-4376-a53a-9011a7e681d9" containerID="42beb27ed312c971ba6b1e765200b7fff652794381276c3378000aa392f8c5f9" exitCode=0 Mar 20 11:06:04 crc kubenswrapper[4772]: I0320 11:06:04.103138 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-znz9n" event={"ID":"5a30b4d8-69d6-4376-a53a-9011a7e681d9","Type":"ContainerDied","Data":"42beb27ed312c971ba6b1e765200b7fff652794381276c3378000aa392f8c5f9"} Mar 20 11:06:05 crc kubenswrapper[4772]: I0320 11:06:05.317469 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-znz9n" Mar 20 11:06:05 crc kubenswrapper[4772]: I0320 11:06:05.464224 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8hj4t\" (UniqueName: \"kubernetes.io/projected/5a30b4d8-69d6-4376-a53a-9011a7e681d9-kube-api-access-8hj4t\") pod \"5a30b4d8-69d6-4376-a53a-9011a7e681d9\" (UID: \"5a30b4d8-69d6-4376-a53a-9011a7e681d9\") " Mar 20 11:06:05 crc kubenswrapper[4772]: I0320 11:06:05.470537 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5a30b4d8-69d6-4376-a53a-9011a7e681d9-kube-api-access-8hj4t" (OuterVolumeSpecName: "kube-api-access-8hj4t") pod "5a30b4d8-69d6-4376-a53a-9011a7e681d9" (UID: "5a30b4d8-69d6-4376-a53a-9011a7e681d9"). InnerVolumeSpecName "kube-api-access-8hj4t". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:06:05 crc kubenswrapper[4772]: I0320 11:06:05.565626 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8hj4t\" (UniqueName: \"kubernetes.io/projected/5a30b4d8-69d6-4376-a53a-9011a7e681d9-kube-api-access-8hj4t\") on node \"crc\" DevicePath \"\"" Mar 20 11:06:06 crc kubenswrapper[4772]: I0320 11:06:06.114544 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566746-znz9n" event={"ID":"5a30b4d8-69d6-4376-a53a-9011a7e681d9","Type":"ContainerDied","Data":"b8500d67d6e44fd56dc39ab207676c03e5eb3135f95093e978b228d324322fe7"} Mar 20 11:06:06 crc kubenswrapper[4772]: I0320 11:06:06.114589 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8500d67d6e44fd56dc39ab207676c03e5eb3135f95093e978b228d324322fe7" Mar 20 11:06:06 crc kubenswrapper[4772]: I0320 11:06:06.114619 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566746-znz9n" Mar 20 11:06:06 crc kubenswrapper[4772]: I0320 11:06:06.160477 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-v69rd"] Mar 20 11:06:06 crc kubenswrapper[4772]: I0320 11:06:06.164648 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566740-v69rd"] Mar 20 11:06:06 crc kubenswrapper[4772]: I0320 11:06:06.648896 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd765c99-6471-4286-9dfe-c647dd180320" path="/var/lib/kubelet/pods/fd765c99-6471-4286-9dfe-c647dd180320/volumes" Mar 20 11:06:09 crc kubenswrapper[4772]: I0320 11:06:09.564986 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:06:09 crc kubenswrapper[4772]: I0320 11:06:09.565337 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:06:28 crc kubenswrapper[4772]: I0320 11:06:28.272033 4772 scope.go:117] "RemoveContainer" containerID="7b724749b84b64887424fbe66d813cfe02190b9d27467daf6b4261992ae80790" Mar 20 11:06:28 crc kubenswrapper[4772]: I0320 11:06:28.309149 4772 scope.go:117] "RemoveContainer" containerID="0b1b0f547474a86220ba01b4225bdb9e4b8d5cc0ed8f6fb441e918b9447359dc" Mar 20 11:06:39 crc kubenswrapper[4772]: I0320 11:06:39.564565 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:06:39 crc kubenswrapper[4772]: I0320 11:06:39.565109 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:06:39 crc kubenswrapper[4772]: I0320 11:06:39.565156 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 11:06:39 crc kubenswrapper[4772]: I0320 11:06:39.565686 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"2a850ae2fa64972c823328c8fe9588b84861e2acf1ca840a7222809a1aa0c1ff"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:06:39 crc kubenswrapper[4772]: I0320 11:06:39.565746 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://2a850ae2fa64972c823328c8fe9588b84861e2acf1ca840a7222809a1aa0c1ff" 
gracePeriod=600 Mar 20 11:06:40 crc kubenswrapper[4772]: I0320 11:06:40.291364 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="2a850ae2fa64972c823328c8fe9588b84861e2acf1ca840a7222809a1aa0c1ff" exitCode=0 Mar 20 11:06:40 crc kubenswrapper[4772]: I0320 11:06:40.291419 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"2a850ae2fa64972c823328c8fe9588b84861e2acf1ca840a7222809a1aa0c1ff"} Mar 20 11:06:40 crc kubenswrapper[4772]: I0320 11:06:40.292230 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"c031238f25d43745bddff1c50d95ad51119ab5adc3b084d1a2d3a9cfa70802a1"} Mar 20 11:06:40 crc kubenswrapper[4772]: I0320 11:06:40.292261 4772 scope.go:117] "RemoveContainer" containerID="c7d6b41b9d4ea0c87e4e55d0473f2ee78f694c9c978233327bfd7e8f2cecafdc" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.134380 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566748-dzlb9"] Mar 20 11:08:00 crc kubenswrapper[4772]: E0320 11:08:00.135158 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5a30b4d8-69d6-4376-a53a-9011a7e681d9" containerName="oc" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.135170 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5a30b4d8-69d6-4376-a53a-9011a7e681d9" containerName="oc" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.135262 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5a30b4d8-69d6-4376-a53a-9011a7e681d9" containerName="oc" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.135632 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-dzlb9" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.138951 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.139165 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.138953 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.141412 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-dzlb9"] Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.238705 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p99rz\" (UniqueName: \"kubernetes.io/projected/3bb2e471-12ed-447a-87ec-551878c46fea-kube-api-access-p99rz\") pod \"auto-csr-approver-29566748-dzlb9\" (UID: \"3bb2e471-12ed-447a-87ec-551878c46fea\") " pod="openshift-infra/auto-csr-approver-29566748-dzlb9" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.340113 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p99rz\" (UniqueName: \"kubernetes.io/projected/3bb2e471-12ed-447a-87ec-551878c46fea-kube-api-access-p99rz\") pod \"auto-csr-approver-29566748-dzlb9\" (UID: \"3bb2e471-12ed-447a-87ec-551878c46fea\") " pod="openshift-infra/auto-csr-approver-29566748-dzlb9" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.360518 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p99rz\" (UniqueName: \"kubernetes.io/projected/3bb2e471-12ed-447a-87ec-551878c46fea-kube-api-access-p99rz\") pod \"auto-csr-approver-29566748-dzlb9\" (UID: \"3bb2e471-12ed-447a-87ec-551878c46fea\") " pod="openshift-infra/auto-csr-approver-29566748-dzlb9" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.462741 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-dzlb9" Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.685335 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-dzlb9"] Mar 20 11:08:00 crc kubenswrapper[4772]: I0320 11:08:00.794029 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-dzlb9" event={"ID":"3bb2e471-12ed-447a-87ec-551878c46fea","Type":"ContainerStarted","Data":"73dce03c25a4436f0426f123543b2a998df576a195c61c7e81c992da0798552d"} Mar 20 11:08:02 crc kubenswrapper[4772]: I0320 11:08:02.804307 4772 generic.go:334] "Generic (PLEG): container finished" podID="3bb2e471-12ed-447a-87ec-551878c46fea" containerID="084311825a5bcf13e2fa31bc1f33e2dae25a1b53f78a0f645b4ace1d4f9c250f" exitCode=0 Mar 20 11:08:02 crc kubenswrapper[4772]: I0320 11:08:02.804361 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-dzlb9" event={"ID":"3bb2e471-12ed-447a-87ec-551878c46fea","Type":"ContainerDied","Data":"084311825a5bcf13e2fa31bc1f33e2dae25a1b53f78a0f645b4ace1d4f9c250f"} Mar 20 11:08:03 crc kubenswrapper[4772]: I0320 11:08:03.997675 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-dzlb9" Mar 20 11:08:04 crc kubenswrapper[4772]: I0320 11:08:04.186595 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p99rz\" (UniqueName: \"kubernetes.io/projected/3bb2e471-12ed-447a-87ec-551878c46fea-kube-api-access-p99rz\") pod \"3bb2e471-12ed-447a-87ec-551878c46fea\" (UID: \"3bb2e471-12ed-447a-87ec-551878c46fea\") " Mar 20 11:08:04 crc kubenswrapper[4772]: I0320 11:08:04.202730 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3bb2e471-12ed-447a-87ec-551878c46fea-kube-api-access-p99rz" (OuterVolumeSpecName: "kube-api-access-p99rz") pod "3bb2e471-12ed-447a-87ec-551878c46fea" (UID: "3bb2e471-12ed-447a-87ec-551878c46fea"). InnerVolumeSpecName "kube-api-access-p99rz". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:08:04 crc kubenswrapper[4772]: I0320 11:08:04.288055 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p99rz\" (UniqueName: \"kubernetes.io/projected/3bb2e471-12ed-447a-87ec-551878c46fea-kube-api-access-p99rz\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:04 crc kubenswrapper[4772]: I0320 11:08:04.816159 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566748-dzlb9" event={"ID":"3bb2e471-12ed-447a-87ec-551878c46fea","Type":"ContainerDied","Data":"73dce03c25a4436f0426f123543b2a998df576a195c61c7e81c992da0798552d"} Mar 20 11:08:04 crc kubenswrapper[4772]: I0320 11:08:04.816200 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="73dce03c25a4436f0426f123543b2a998df576a195c61c7e81c992da0798552d" Mar 20 11:08:04 crc kubenswrapper[4772]: I0320 11:08:04.816212 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566748-dzlb9" Mar 20 11:08:05 crc kubenswrapper[4772]: I0320 11:08:05.058386 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-fl9lv"] Mar 20 11:08:05 crc kubenswrapper[4772]: I0320 11:08:05.063807 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566742-fl9lv"] Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.649054 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5590a77f-c988-4799-8fa8-ffb41d9153f2" path="/var/lib/kubelet/pods/5590a77f-c988-4799-8fa8-ffb41d9153f2/volumes" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.704273 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b4mlm"] Mar 20 11:08:06 crc kubenswrapper[4772]: E0320 11:08:06.704558 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3bb2e471-12ed-447a-87ec-551878c46fea" containerName="oc" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.704581 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3bb2e471-12ed-447a-87ec-551878c46fea" containerName="oc" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.704725 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3bb2e471-12ed-447a-87ec-551878c46fea" containerName="oc" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.705244 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mlm" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.707668 4772 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-cainjector-dockercfg-45sc9" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.707787 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"openshift-service-ca.crt" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.709020 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"cert-manager"/"kube-root-ca.crt" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.709278 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-858654f9db-c2s9c"] Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.710145 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-c2s9c" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.712366 4772 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-dockercfg-qxdj5" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.728993 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b4mlm"] Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.756011 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rgz26"] Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.756909 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rgz26" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.759362 4772 reflector.go:368] Caches populated for *v1.Secret from object-"cert-manager"/"cert-manager-webhook-dockercfg-p2p75" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.775867 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-c2s9c"] Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.782595 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rgz26"] Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.818702 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-279n9\" (UniqueName: \"kubernetes.io/projected/693f7934-75ca-41bc-9bc1-20f7b9da436e-kube-api-access-279n9\") pod \"cert-manager-cainjector-cf98fcc89-b4mlm\" (UID: \"693f7934-75ca-41bc-9bc1-20f7b9da436e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mlm" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.819184 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ms7b4\" (UniqueName: \"kubernetes.io/projected/e0cd75e7-19e1-431d-a863-c4ed52878e91-kube-api-access-ms7b4\") pod \"cert-manager-858654f9db-c2s9c\" (UID: \"e0cd75e7-19e1-431d-a863-c4ed52878e91\") " pod="cert-manager/cert-manager-858654f9db-c2s9c" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.920289 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-279n9\" (UniqueName: \"kubernetes.io/projected/693f7934-75ca-41bc-9bc1-20f7b9da436e-kube-api-access-279n9\") pod \"cert-manager-cainjector-cf98fcc89-b4mlm\" (UID: \"693f7934-75ca-41bc-9bc1-20f7b9da436e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mlm" Mar 20 11:08:06 crc 
kubenswrapper[4772]: I0320 11:08:06.920363 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ms7b4\" (UniqueName: \"kubernetes.io/projected/e0cd75e7-19e1-431d-a863-c4ed52878e91-kube-api-access-ms7b4\") pod \"cert-manager-858654f9db-c2s9c\" (UID: \"e0cd75e7-19e1-431d-a863-c4ed52878e91\") " pod="cert-manager/cert-manager-858654f9db-c2s9c" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.920443 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvht7\" (UniqueName: \"kubernetes.io/projected/7ef174dc-35a4-4a44-a5a5-7f7d48284b14-kube-api-access-mvht7\") pod \"cert-manager-webhook-687f57d79b-rgz26\" (UID: \"7ef174dc-35a4-4a44-a5a5-7f7d48284b14\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rgz26" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.936990 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-279n9\" (UniqueName: \"kubernetes.io/projected/693f7934-75ca-41bc-9bc1-20f7b9da436e-kube-api-access-279n9\") pod \"cert-manager-cainjector-cf98fcc89-b4mlm\" (UID: \"693f7934-75ca-41bc-9bc1-20f7b9da436e\") " pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mlm" Mar 20 11:08:06 crc kubenswrapper[4772]: I0320 11:08:06.939186 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ms7b4\" (UniqueName: \"kubernetes.io/projected/e0cd75e7-19e1-431d-a863-c4ed52878e91-kube-api-access-ms7b4\") pod \"cert-manager-858654f9db-c2s9c\" (UID: \"e0cd75e7-19e1-431d-a863-c4ed52878e91\") " pod="cert-manager/cert-manager-858654f9db-c2s9c" Mar 20 11:08:07 crc kubenswrapper[4772]: I0320 11:08:07.022884 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvht7\" (UniqueName: \"kubernetes.io/projected/7ef174dc-35a4-4a44-a5a5-7f7d48284b14-kube-api-access-mvht7\") pod \"cert-manager-webhook-687f57d79b-rgz26\" (UID: \"7ef174dc-35a4-4a44-a5a5-7f7d48284b14\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rgz26" Mar 20 11:08:07 crc kubenswrapper[4772]: I0320 11:08:07.025867 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mlm" Mar 20 11:08:07 crc kubenswrapper[4772]: I0320 11:08:07.043721 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvht7\" (UniqueName: \"kubernetes.io/projected/7ef174dc-35a4-4a44-a5a5-7f7d48284b14-kube-api-access-mvht7\") pod \"cert-manager-webhook-687f57d79b-rgz26\" (UID: \"7ef174dc-35a4-4a44-a5a5-7f7d48284b14\") " pod="cert-manager/cert-manager-webhook-687f57d79b-rgz26" Mar 20 11:08:07 crc kubenswrapper[4772]: I0320 11:08:07.059698 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-858654f9db-c2s9c" Mar 20 11:08:07 crc kubenswrapper[4772]: I0320 11:08:07.074460 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-webhook-687f57d79b-rgz26" Mar 20 11:08:07 crc kubenswrapper[4772]: I0320 11:08:07.273757 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-cf98fcc89-b4mlm"] Mar 20 11:08:07 crc kubenswrapper[4772]: I0320 11:08:07.310939 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-858654f9db-c2s9c"] Mar 20 11:08:07 crc kubenswrapper[4772]: I0320 11:08:07.345882 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-687f57d79b-rgz26"] Mar 20 11:08:07 crc kubenswrapper[4772]: W0320 11:08:07.348417 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7ef174dc_35a4_4a44_a5a5_7f7d48284b14.slice/crio-840393c522a88b98616b5ea4cc59a372ab9d515e2e44c45f5a7219c6ef39778b WatchSource:0}: Error finding container 840393c522a88b98616b5ea4cc59a372ab9d515e2e44c45f5a7219c6ef39778b: Status 404 returned error can't find the container with id 840393c522a88b98616b5ea4cc59a372ab9d515e2e44c45f5a7219c6ef39778b Mar 20 11:08:07 crc kubenswrapper[4772]: I0320 11:08:07.833449 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rgz26" event={"ID":"7ef174dc-35a4-4a44-a5a5-7f7d48284b14","Type":"ContainerStarted","Data":"840393c522a88b98616b5ea4cc59a372ab9d515e2e44c45f5a7219c6ef39778b"} Mar 20 11:08:07 crc kubenswrapper[4772]: I0320 11:08:07.835898 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mlm" event={"ID":"693f7934-75ca-41bc-9bc1-20f7b9da436e","Type":"ContainerStarted","Data":"8aa4e9f3fa1e0831c0cd3a56dfff4cb1574b59bfcffcca471905ed46b1cc2d79"} Mar 20 11:08:07 crc kubenswrapper[4772]: I0320 11:08:07.838126 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-c2s9c" event={"ID":"e0cd75e7-19e1-431d-a863-c4ed52878e91","Type":"ContainerStarted","Data":"68eb2510f6d14cd8a1a19530cb2d2b22b0996d486bc5453a691e4e0a22b5ffcf"} Mar 20 11:08:13 crc kubenswrapper[4772]: I0320 11:08:13.870801 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mlm" event={"ID":"693f7934-75ca-41bc-9bc1-20f7b9da436e","Type":"ContainerStarted","Data":"c5980f49e7f82cc10c9701fc0337295d48817034f8bbfd638b8b46326cb87d8f"} Mar 20 11:08:13 crc kubenswrapper[4772]: I0320 11:08:13.872748 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-858654f9db-c2s9c" event={"ID":"e0cd75e7-19e1-431d-a863-c4ed52878e91","Type":"ContainerStarted","Data":"493ead542e1084a2bc1d6dec4d1a9511290d4bd03c128337111284d1f89b985b"} Mar 20 11:08:13 crc kubenswrapper[4772]: I0320 11:08:13.875766 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-687f57d79b-rgz26" event={"ID":"7ef174dc-35a4-4a44-a5a5-7f7d48284b14","Type":"ContainerStarted","Data":"b214c744fdc776436498528e0d7fcef8b832f2903b1d8d01fa3e1f5d3b863434"} Mar 20 11:08:13 crc kubenswrapper[4772]: I0320 11:08:13.876117 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="cert-manager/cert-manager-webhook-687f57d79b-rgz26" Mar 20 11:08:13 crc kubenswrapper[4772]: I0320 11:08:13.887628 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-cf98fcc89-b4mlm" podStartSLOduration=1.523231019 podStartE2EDuration="7.887609238s" 
podCreationTimestamp="2026-03-20 11:08:06 +0000 UTC" firstStartedPulling="2026-03-20 11:08:07.283484894 +0000 UTC m=+773.374451369" lastFinishedPulling="2026-03-20 11:08:13.647863113 +0000 UTC m=+779.738829588" observedRunningTime="2026-03-20 11:08:13.88332618 +0000 UTC m=+779.974292685" watchObservedRunningTime="2026-03-20 11:08:13.887609238 +0000 UTC m=+779.978575723" Mar 20 11:08:13 crc kubenswrapper[4772]: I0320 11:08:13.907603 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-687f57d79b-rgz26" podStartSLOduration=2.520119322 podStartE2EDuration="7.907579475s" podCreationTimestamp="2026-03-20 11:08:06 +0000 UTC" firstStartedPulling="2026-03-20 11:08:07.351071183 +0000 UTC m=+773.442037668" lastFinishedPulling="2026-03-20 11:08:12.738531336 +0000 UTC m=+778.829497821" observedRunningTime="2026-03-20 11:08:13.900964463 +0000 UTC m=+779.991930948" watchObservedRunningTime="2026-03-20 11:08:13.907579475 +0000 UTC m=+779.998545960" Mar 20 11:08:13 crc kubenswrapper[4772]: I0320 11:08:13.922993 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-858654f9db-c2s9c" podStartSLOduration=2.764984754 podStartE2EDuration="7.922971187s" podCreationTimestamp="2026-03-20 11:08:06 +0000 UTC" firstStartedPulling="2026-03-20 11:08:07.316472932 +0000 UTC m=+773.407439417" lastFinishedPulling="2026-03-20 11:08:12.474459365 +0000 UTC m=+778.565425850" observedRunningTime="2026-03-20 11:08:13.919596684 +0000 UTC m=+780.010563169" watchObservedRunningTime="2026-03-20 11:08:13.922971187 +0000 UTC m=+780.013937672" Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.587105 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8p9x"] Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.587862 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovn-controller" containerID="cri-o://1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa" gracePeriod=30 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.588003 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="northd" containerID="cri-o://2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b" gracePeriod=30 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.588031 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="sbdb" containerID="cri-o://58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1" gracePeriod=30 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.588081 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="nbdb" containerID="cri-o://09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79" gracePeriod=30 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.588172 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="kube-rbac-proxy-node" 
containerID="cri-o://7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26" gracePeriod=30 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.588179 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e" gracePeriod=30 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.588216 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovn-acl-logging" containerID="cri-o://fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5" gracePeriod=30 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.625831 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" containerID="cri-o://993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c" gracePeriod=30 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.896458 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovnkube-controller/3.log" Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.899589 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovn-acl-logging/0.log" Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900121 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovn-controller/0.log" Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900475 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c" exitCode=0 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900504 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1" exitCode=0 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900515 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e" exitCode=0 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900526 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26" exitCode=0 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900535 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5" exitCode=143 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900544 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa" exitCode=143 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900593 4772 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c"} Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900627 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1"} Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900639 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e"} Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900652 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26"} Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900664 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5"} Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900676 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa"} Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.900696 4772 scope.go:117] "RemoveContainer" containerID="12a696e3267de94fceb5cfe6208dadb838e263678fef9ba7548d3f6496ea56a4" Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.907156 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7fpq9_a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d/kube-multus/2.log" Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.908162 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7fpq9_a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d/kube-multus/1.log" Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.908234 4772 generic.go:334] "Generic (PLEG): container finished" podID="a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d" containerID="d695a81a4a5904d8b259c89eb9d85c68d7f8c623f84f0a020356bde5e5f9cbc3" exitCode=2 Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.908285 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7fpq9" event={"ID":"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d","Type":"ContainerDied","Data":"d695a81a4a5904d8b259c89eb9d85c68d7f8c623f84f0a020356bde5e5f9cbc3"} Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.908902 4772 scope.go:117] "RemoveContainer" containerID="d695a81a4a5904d8b259c89eb9d85c68d7f8c623f84f0a020356bde5e5f9cbc3" Mar 20 11:08:16 crc kubenswrapper[4772]: E0320 11:08:16.909179 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-7fpq9_openshift-multus(a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d)\"" pod="openshift-multus/multus-7fpq9" 
podUID="a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d" Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.980462 4772 scope.go:117] "RemoveContainer" containerID="b8e4983a9a26fa55a1dc7ac0d0e730226faa13ce43dd93b98ad9e59b381ed003" Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.985341 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovn-acl-logging/0.log" Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.985790 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovn-controller/0.log" Mar 20 11:08:16 crc kubenswrapper[4772]: I0320 11:08:16.986490 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.044537 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-jt9kx"] Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.044780 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="sbdb" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.044800 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="sbdb" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.044815 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.044823 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.044854 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="nbdb" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.044863 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="nbdb" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.044874 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.044881 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.044891 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="northd" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.044898 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="northd" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.044911 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovn-acl-logging" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.044919 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovn-acl-logging" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.044930 4772 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.044937 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.044947 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="kubecfg-setup" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.044955 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="kubecfg-setup" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.044968 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="kube-rbac-proxy-node" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.044975 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="kube-rbac-proxy-node" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.044984 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.044991 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.044999 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovn-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045006 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovn-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.045016 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045023 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045151 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="sbdb" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045161 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045171 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovn-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045185 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="nbdb" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045195 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="kube-rbac-proxy-ovn-metrics" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045204 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="kube-rbac-proxy-node" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045212 
4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045224 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovn-acl-logging" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045234 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="northd" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045242 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: E0320 11:08:17.045352 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045361 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045460 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.045625 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="d62da04c-5422-4320-9352-8959b89501be" containerName="ovnkube-controller" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.046947 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.054818 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-bin\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.054903 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.054917 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-netns\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.054947 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.054953 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-systemd\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055011 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-var-lib-openvswitch\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055074 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-kubelet\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055116 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055170 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055186 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055219 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-openvswitch\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055229 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055249 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-log-socket\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055262 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "run-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055300 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-etc-openvswitch\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055346 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-config\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055387 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-script-lib\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055433 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-ovn\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055387 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055477 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-netd\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055527 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055582 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055711 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-slash\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055761 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-ovn-kubernetes\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055799 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-node-log\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055861 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-env-overrides\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055831 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-slash" (OuterVolumeSpecName: "host-slash") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055928 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-node-log" (OuterVolumeSpecName: "node-log") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055941 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "systemd-units". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.055901 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-systemd-units\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056007 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d62da04c-5422-4320-9352-8959b89501be-ovn-node-metrics-cert\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056035 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-js95g\" (UniqueName: \"kubernetes.io/projected/d62da04c-5422-4320-9352-8959b89501be-kube-api-access-js95g\") pod \"d62da04c-5422-4320-9352-8959b89501be\" (UID: \"d62da04c-5422-4320-9352-8959b89501be\") " Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056185 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "ovnkube-script-lib". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056279 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91122d8e-5f5b-4495-a225-b9eae87b3f4f-ovnkube-script-lib\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056323 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-systemd-units\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056389 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9hfmx\" (UniqueName: \"kubernetes.io/projected/91122d8e-5f5b-4495-a225-b9eae87b3f4f-kube-api-access-9hfmx\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056447 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-run-systemd\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056489 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-var-lib-cni-networks-ovn-kubernetes\") pod 
\"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056542 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91122d8e-5f5b-4495-a225-b9eae87b3f4f-ovn-node-metrics-cert\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056579 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-slash\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056615 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-var-lib-openvswitch\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056619 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056655 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-kubelet\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056699 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-node-log\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056748 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-log-socket" (OuterVolumeSpecName: "log-socket") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056783 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056813 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-cni-bin\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.056921 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91122d8e-5f5b-4495-a225-b9eae87b3f4f-ovnkube-config\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.057190 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-run-ovn-kubernetes\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.057291 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.057317 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-etc-openvswitch\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.057372 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-cni-netd\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.057430 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-log-socket\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.057488 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-run-ovn\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.057554 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-run-openvswitch\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.058909 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-run-netns\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.058982 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91122d8e-5f5b-4495-a225-b9eae87b3f4f-env-overrides\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059136 4772 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-bin\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059153 4772 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-netns\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059168 4772 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059180 4772 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-kubelet\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059192 4772 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059207 4772 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059218 4772 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-log-socket\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059229 4772 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059242 4772 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059253 4772 
reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059272 4772 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-ovn\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059283 4772 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-cni-netd\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059297 4772 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-slash\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059309 4772 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059446 4772 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-node-log\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059468 4772 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d62da04c-5422-4320-9352-8959b89501be-env-overrides\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.059482 4772 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-systemd-units\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.060448 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d62da04c-5422-4320-9352-8959b89501be-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.060488 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d62da04c-5422-4320-9352-8959b89501be-kube-api-access-js95g" (OuterVolumeSpecName: "kube-api-access-js95g") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "kube-api-access-js95g". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.069667 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d62da04c-5422-4320-9352-8959b89501be" (UID: "d62da04c-5422-4320-9352-8959b89501be"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.079515 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-687f57d79b-rgz26" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.160779 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-log-socket\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.160857 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-run-ovn\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.160890 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-run-openvswitch\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.160938 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-run-netns\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.160960 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91122d8e-5f5b-4495-a225-b9eae87b3f4f-env-overrides\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.160971 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-log-socket\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.160987 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-systemd-units\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161024 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-run-netns\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161064 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91122d8e-5f5b-4495-a225-b9eae87b3f4f-ovnkube-script-lib\") pod 
\"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161060 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-run-openvswitch\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161110 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-systemd-units\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161060 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-run-ovn\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161125 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9hfmx\" (UniqueName: \"kubernetes.io/projected/91122d8e-5f5b-4495-a225-b9eae87b3f4f-kube-api-access-9hfmx\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161247 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-run-systemd\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161285 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161344 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91122d8e-5f5b-4495-a225-b9eae87b3f4f-ovn-node-metrics-cert\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161360 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-run-systemd\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161371 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-slash\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161419 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-slash\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161441 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-var-lib-openvswitch\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161479 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-kubelet\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161507 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-node-log\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161532 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-cni-bin\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161555 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91122d8e-5f5b-4495-a225-b9eae87b3f4f-ovnkube-config\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161580 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-run-ovn-kubernetes\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161611 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-etc-openvswitch\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161630 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/91122d8e-5f5b-4495-a225-b9eae87b3f4f-env-overrides\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161640 
4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-cni-netd\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161672 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-cni-netd\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161731 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-var-lib-openvswitch\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161750 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-kubelet\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161764 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-node-log\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161779 4772 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d62da04c-5422-4320-9352-8959b89501be-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161793 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-cni-bin\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161804 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-js95g\" (UniqueName: \"kubernetes.io/projected/d62da04c-5422-4320-9352-8959b89501be-kube-api-access-js95g\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161814 4772 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d62da04c-5422-4320-9352-8959b89501be-run-systemd\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161887 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-run-ovn-kubernetes\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.161922 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-etc-openvswitch\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.162477 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/91122d8e-5f5b-4495-a225-b9eae87b3f4f-ovnkube-config\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.162582 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/91122d8e-5f5b-4495-a225-b9eae87b3f4f-ovnkube-script-lib\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.163780 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/91122d8e-5f5b-4495-a225-b9eae87b3f4f-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.165560 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/91122d8e-5f5b-4495-a225-b9eae87b3f4f-ovn-node-metrics-cert\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.179673 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9hfmx\" (UniqueName: \"kubernetes.io/projected/91122d8e-5f5b-4495-a225-b9eae87b3f4f-kube-api-access-9hfmx\") pod \"ovnkube-node-jt9kx\" (UID: \"91122d8e-5f5b-4495-a225-b9eae87b3f4f\") " pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.360672 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:17 crc kubenswrapper[4772]: W0320 11:08:17.385529 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod91122d8e_5f5b_4495_a225_b9eae87b3f4f.slice/crio-fa16794973675f70480c278e69229faa052aeeac9a7d80b155a6c4018dfeaedc WatchSource:0}: Error finding container fa16794973675f70480c278e69229faa052aeeac9a7d80b155a6c4018dfeaedc: Status 404 returned error can't find the container with id fa16794973675f70480c278e69229faa052aeeac9a7d80b155a6c4018dfeaedc Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.918666 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7fpq9_a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d/kube-multus/2.log" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.921215 4772 generic.go:334] "Generic (PLEG): container finished" podID="91122d8e-5f5b-4495-a225-b9eae87b3f4f" containerID="3a31bb718d5d958bc1bcd0009015d4992fa2b549718479cca9adf414c4259b06" exitCode=0 Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.921664 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" event={"ID":"91122d8e-5f5b-4495-a225-b9eae87b3f4f","Type":"ContainerDied","Data":"3a31bb718d5d958bc1bcd0009015d4992fa2b549718479cca9adf414c4259b06"} Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.921762 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" event={"ID":"91122d8e-5f5b-4495-a225-b9eae87b3f4f","Type":"ContainerStarted","Data":"fa16794973675f70480c278e69229faa052aeeac9a7d80b155a6c4018dfeaedc"} Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.927420 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovn-acl-logging/0.log" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.928052 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-z8p9x_d62da04c-5422-4320-9352-8959b89501be/ovn-controller/0.log" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.931153 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79" exitCode=0 Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.931193 4772 generic.go:334] "Generic (PLEG): container finished" podID="d62da04c-5422-4320-9352-8959b89501be" containerID="2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b" exitCode=0 Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.931232 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79"} Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.931283 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b"} Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.931301 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" 
event={"ID":"d62da04c-5422-4320-9352-8959b89501be","Type":"ContainerDied","Data":"b510773ae06d1b3a3b1821abb61a4eb0f7748a9df1b3b32be318e47589d9cc9d"} Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.931320 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-z8p9x" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.931324 4772 scope.go:117] "RemoveContainer" containerID="993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.967441 4772 scope.go:117] "RemoveContainer" containerID="58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1" Mar 20 11:08:17 crc kubenswrapper[4772]: I0320 11:08:17.997540 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8p9x"] Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.004468 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-z8p9x"] Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.021742 4772 scope.go:117] "RemoveContainer" containerID="09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.057654 4772 scope.go:117] "RemoveContainer" containerID="2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.090613 4772 scope.go:117] "RemoveContainer" containerID="351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.107687 4772 scope.go:117] "RemoveContainer" containerID="7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.124409 4772 scope.go:117] "RemoveContainer" containerID="fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.145101 4772 scope.go:117] "RemoveContainer" containerID="1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.167422 4772 scope.go:117] "RemoveContainer" containerID="f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.213771 4772 scope.go:117] "RemoveContainer" containerID="993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c" Mar 20 11:08:18 crc kubenswrapper[4772]: E0320 11:08:18.214191 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c\": container with ID starting with 993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c not found: ID does not exist" containerID="993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.214230 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c"} err="failed to get container status \"993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c\": rpc error: code = NotFound desc = could not find container \"993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c\": container with ID starting with 993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c not found: ID does not exist" Mar 20 11:08:18 crc 
kubenswrapper[4772]: I0320 11:08:18.214259 4772 scope.go:117] "RemoveContainer" containerID="58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1" Mar 20 11:08:18 crc kubenswrapper[4772]: E0320 11:08:18.214502 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\": container with ID starting with 58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1 not found: ID does not exist" containerID="58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.214530 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1"} err="failed to get container status \"58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\": rpc error: code = NotFound desc = could not find container \"58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\": container with ID starting with 58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1 not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.214550 4772 scope.go:117] "RemoveContainer" containerID="09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79" Mar 20 11:08:18 crc kubenswrapper[4772]: E0320 11:08:18.214739 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\": container with ID starting with 09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79 not found: ID does not exist" containerID="09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.214766 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79"} err="failed to get container status \"09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\": rpc error: code = NotFound desc = could not find container \"09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\": container with ID starting with 09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79 not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.214786 4772 scope.go:117] "RemoveContainer" containerID="2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b" Mar 20 11:08:18 crc kubenswrapper[4772]: E0320 11:08:18.215187 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\": container with ID starting with 2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b not found: ID does not exist" containerID="2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.215218 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b"} err="failed to get container status \"2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\": rpc error: code = NotFound desc = could not find container 
\"2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\": container with ID starting with 2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.215235 4772 scope.go:117] "RemoveContainer" containerID="351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e" Mar 20 11:08:18 crc kubenswrapper[4772]: E0320 11:08:18.215664 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\": container with ID starting with 351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e not found: ID does not exist" containerID="351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.215695 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e"} err="failed to get container status \"351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\": rpc error: code = NotFound desc = could not find container \"351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\": container with ID starting with 351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.215715 4772 scope.go:117] "RemoveContainer" containerID="7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26" Mar 20 11:08:18 crc kubenswrapper[4772]: E0320 11:08:18.216015 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\": container with ID starting with 7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26 not found: ID does not exist" containerID="7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.216044 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26"} err="failed to get container status \"7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\": rpc error: code = NotFound desc = could not find container \"7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\": container with ID starting with 7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26 not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.216062 4772 scope.go:117] "RemoveContainer" containerID="fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5" Mar 20 11:08:18 crc kubenswrapper[4772]: E0320 11:08:18.216408 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\": container with ID starting with fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5 not found: ID does not exist" containerID="fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.216481 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5"} 
err="failed to get container status \"fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\": rpc error: code = NotFound desc = could not find container \"fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\": container with ID starting with fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5 not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.216539 4772 scope.go:117] "RemoveContainer" containerID="1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa" Mar 20 11:08:18 crc kubenswrapper[4772]: E0320 11:08:18.216895 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\": container with ID starting with 1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa not found: ID does not exist" containerID="1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.216921 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa"} err="failed to get container status \"1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\": rpc error: code = NotFound desc = could not find container \"1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\": container with ID starting with 1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.216938 4772 scope.go:117] "RemoveContainer" containerID="f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d" Mar 20 11:08:18 crc kubenswrapper[4772]: E0320 11:08:18.217296 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\": container with ID starting with f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d not found: ID does not exist" containerID="f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.217325 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d"} err="failed to get container status \"f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\": rpc error: code = NotFound desc = could not find container \"f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\": container with ID starting with f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.217340 4772 scope.go:117] "RemoveContainer" containerID="993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.217587 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c"} err="failed to get container status \"993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c\": rpc error: code = NotFound desc = could not find container \"993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c\": container with ID starting with 
993c7bc46c15d3c85b6255a00444169a318e0abc1daf81cca7545967b02f535c not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.217612 4772 scope.go:117] "RemoveContainer" containerID="58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.217942 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1"} err="failed to get container status \"58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\": rpc error: code = NotFound desc = could not find container \"58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1\": container with ID starting with 58d46226b2170c3116f7de9a71857b9a6a74c8ba5b89ee47026a4725e3220fb1 not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.217976 4772 scope.go:117] "RemoveContainer" containerID="09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.218310 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79"} err="failed to get container status \"09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\": rpc error: code = NotFound desc = could not find container \"09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79\": container with ID starting with 09ea2270b8fec564b8fc8f8283195f8f299af27b44f6a8b6819b12963b688b79 not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.218336 4772 scope.go:117] "RemoveContainer" containerID="2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.218573 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b"} err="failed to get container status \"2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\": rpc error: code = NotFound desc = could not find container \"2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b\": container with ID starting with 2292d6baade04cef70b2f31ce3337a89c9569c68bd94585c605f65e9ac7f1b9b not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.218596 4772 scope.go:117] "RemoveContainer" containerID="351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.218898 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e"} err="failed to get container status \"351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\": rpc error: code = NotFound desc = could not find container \"351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e\": container with ID starting with 351ac83c7a811dea9f86d99fa4f9d4ba9fa28d03e1e2ae3c82a2f6078fb1da0e not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.218946 4772 scope.go:117] "RemoveContainer" containerID="7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.219233 4772 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26"} err="failed to get container status \"7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\": rpc error: code = NotFound desc = could not find container \"7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26\": container with ID starting with 7424ee213e1033b4f8aa6c17b5f920371452946e3ae25b2f46b403a911564b26 not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.219278 4772 scope.go:117] "RemoveContainer" containerID="fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.219602 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5"} err="failed to get container status \"fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\": rpc error: code = NotFound desc = could not find container \"fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5\": container with ID starting with fad9a7cf72bb0186766dc5fc87ce6d931ed2dcbdc5e2d9cc1f80cb9d2481aee5 not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.219627 4772 scope.go:117] "RemoveContainer" containerID="1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.219956 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa"} err="failed to get container status \"1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\": rpc error: code = NotFound desc = could not find container \"1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa\": container with ID starting with 1b92df256299bf1cc26e729c9595ec84cd318102477b32b77fe53c7e8baa33aa not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.220008 4772 scope.go:117] "RemoveContainer" containerID="f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.220502 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d"} err="failed to get container status \"f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\": rpc error: code = NotFound desc = could not find container \"f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d\": container with ID starting with f2e62b487ef77a57b072dbf6b798f33d1b3b8cd9c3575d0ee3b176647a00533d not found: ID does not exist" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.649122 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d62da04c-5422-4320-9352-8959b89501be" path="/var/lib/kubelet/pods/d62da04c-5422-4320-9352-8959b89501be/volumes" Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.938637 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" event={"ID":"91122d8e-5f5b-4495-a225-b9eae87b3f4f","Type":"ContainerStarted","Data":"1f3a36397d2632fe38b0db8746704be9f0b9be8b8326a6d4c440c096666a309a"} Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.938680 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" 
event={"ID":"91122d8e-5f5b-4495-a225-b9eae87b3f4f","Type":"ContainerStarted","Data":"090306ca99bd205f25f9898e533bd8fd3a43cd7a6289e96053bd4a3385d24783"} Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.938694 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" event={"ID":"91122d8e-5f5b-4495-a225-b9eae87b3f4f","Type":"ContainerStarted","Data":"cfef7d9cc1ca3971309ebf5508e327bd951b957c3d3a30fe6c616a3d4d3997ac"} Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.938704 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" event={"ID":"91122d8e-5f5b-4495-a225-b9eae87b3f4f","Type":"ContainerStarted","Data":"d747931f841095f56bc151543975c9515f2e3efd44019b1b735b8b51bc13d5b3"} Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.938716 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" event={"ID":"91122d8e-5f5b-4495-a225-b9eae87b3f4f","Type":"ContainerStarted","Data":"fbc230a19aa214fdb1ce93bc0af0346a50cea5acdbbcab4840af08de2e0cec4a"} Mar 20 11:08:18 crc kubenswrapper[4772]: I0320 11:08:18.938725 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" event={"ID":"91122d8e-5f5b-4495-a225-b9eae87b3f4f","Type":"ContainerStarted","Data":"840ab39aaa7080b82505c11a498f8c557155c4b1aa055b353406405f3a4207e9"} Mar 20 11:08:21 crc kubenswrapper[4772]: I0320 11:08:21.964677 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" event={"ID":"91122d8e-5f5b-4495-a225-b9eae87b3f4f","Type":"ContainerStarted","Data":"6da879da0d683717970a53d9f239de3012e1f762222e8bd46fb65602911862d9"} Mar 20 11:08:23 crc kubenswrapper[4772]: I0320 11:08:23.984782 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" event={"ID":"91122d8e-5f5b-4495-a225-b9eae87b3f4f","Type":"ContainerStarted","Data":"717b2397963b38d45f730a9a831c36f059a3bfdb426f376010440f6e2c59e3d1"} Mar 20 11:08:23 crc kubenswrapper[4772]: I0320 11:08:23.985511 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:23 crc kubenswrapper[4772]: I0320 11:08:23.985690 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:24 crc kubenswrapper[4772]: I0320 11:08:24.016756 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" podStartSLOduration=7.016726311 podStartE2EDuration="7.016726311s" podCreationTimestamp="2026-03-20 11:08:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:08:24.013972756 +0000 UTC m=+790.104939271" watchObservedRunningTime="2026-03-20 11:08:24.016726311 +0000 UTC m=+790.107692806" Mar 20 11:08:24 crc kubenswrapper[4772]: I0320 11:08:24.022716 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:24 crc kubenswrapper[4772]: I0320 11:08:24.995615 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:25 crc kubenswrapper[4772]: I0320 11:08:25.034989 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:28 crc kubenswrapper[4772]: I0320 11:08:28.372798 4772 scope.go:117] "RemoveContainer" containerID="4ec9887e0ae666ed50a471610a9fbc68993fbf0bc75de98e2f2ff14b6350cb7f" Mar 20 11:08:30 crc kubenswrapper[4772]: I0320 11:08:30.641831 4772 scope.go:117] "RemoveContainer" containerID="d695a81a4a5904d8b259c89eb9d85c68d7f8c623f84f0a020356bde5e5f9cbc3" Mar 20 11:08:31 crc kubenswrapper[4772]: I0320 11:08:31.032386 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-7fpq9_a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d/kube-multus/2.log" Mar 20 11:08:31 crc kubenswrapper[4772]: I0320 11:08:31.032791 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-7fpq9" event={"ID":"a04aeeb5-2fa5-4466-ac01-e8d9fb19a88d","Type":"ContainerStarted","Data":"f5e3782a1627d36eab3a3112d9d348ccbbfb6b49f1303398e51f54aa7328585c"} Mar 20 11:08:39 crc kubenswrapper[4772]: I0320 11:08:39.564968 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:08:39 crc kubenswrapper[4772]: I0320 11:08:39.565502 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:08:47 crc kubenswrapper[4772]: I0320 11:08:47.392055 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-jt9kx" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.640635 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2"] Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.642152 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.646605 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.657259 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2"] Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.733646 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.733706 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdcf9\" (UniqueName: \"kubernetes.io/projected/6f03ad17-5689-47ee-87f9-cb5562711b9d-kube-api-access-gdcf9\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.733792 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.835039 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.835562 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.835717 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gdcf9\" (UniqueName: \"kubernetes.io/projected/6f03ad17-5689-47ee-87f9-cb5562711b9d-kube-api-access-gdcf9\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.835628 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: 
\"kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-util\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.836066 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-bundle\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.859810 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdcf9\" (UniqueName: \"kubernetes.io/projected/6f03ad17-5689-47ee-87f9-cb5562711b9d-kube-api-access-gdcf9\") pod \"1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:53 crc kubenswrapper[4772]: I0320 11:08:53.983069 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:54 crc kubenswrapper[4772]: I0320 11:08:54.201256 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2"] Mar 20 11:08:54 crc kubenswrapper[4772]: I0320 11:08:54.778362 4772 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Mar 20 11:08:55 crc kubenswrapper[4772]: I0320 11:08:55.176948 4772 generic.go:334] "Generic (PLEG): container finished" podID="6f03ad17-5689-47ee-87f9-cb5562711b9d" containerID="20b8cecdbc863f4c8499031b76ca71505d4d95f191f056c8707bf432f333bb6a" exitCode=0 Mar 20 11:08:55 crc kubenswrapper[4772]: I0320 11:08:55.177047 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" event={"ID":"6f03ad17-5689-47ee-87f9-cb5562711b9d","Type":"ContainerDied","Data":"20b8cecdbc863f4c8499031b76ca71505d4d95f191f056c8707bf432f333bb6a"} Mar 20 11:08:55 crc kubenswrapper[4772]: I0320 11:08:55.177297 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" event={"ID":"6f03ad17-5689-47ee-87f9-cb5562711b9d","Type":"ContainerStarted","Data":"0f7538b90f5481e04dd9d5e503831e610cab5e6029292594cdf1e229327552e7"} Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.000039 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-rvm67"] Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.002630 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.020987 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rvm67"] Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.069035 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-utilities\") pod \"redhat-operators-rvm67\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.069364 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27fmd\" (UniqueName: \"kubernetes.io/projected/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-kube-api-access-27fmd\") pod \"redhat-operators-rvm67\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.069475 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-catalog-content\") pod \"redhat-operators-rvm67\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.170671 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-utilities\") pod \"redhat-operators-rvm67\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.170983 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27fmd\" (UniqueName: \"kubernetes.io/projected/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-kube-api-access-27fmd\") pod \"redhat-operators-rvm67\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.171213 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-catalog-content\") pod \"redhat-operators-rvm67\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.171231 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-utilities\") pod \"redhat-operators-rvm67\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.171437 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-catalog-content\") pod \"redhat-operators-rvm67\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.197960 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-27fmd\" (UniqueName: \"kubernetes.io/projected/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-kube-api-access-27fmd\") pod \"redhat-operators-rvm67\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.329886 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:08:56 crc kubenswrapper[4772]: I0320 11:08:56.523033 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-rvm67"] Mar 20 11:08:56 crc kubenswrapper[4772]: W0320 11:08:56.564647 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod56d3e8ea_3da8_47b1_95a1_d3a062b74fab.slice/crio-f9fe9124de9052a10528d90a9151b077eae9ac0a16a3f61f5a156f222e27c313 WatchSource:0}: Error finding container f9fe9124de9052a10528d90a9151b077eae9ac0a16a3f61f5a156f222e27c313: Status 404 returned error can't find the container with id f9fe9124de9052a10528d90a9151b077eae9ac0a16a3f61f5a156f222e27c313 Mar 20 11:08:57 crc kubenswrapper[4772]: I0320 11:08:57.189167 4772 generic.go:334] "Generic (PLEG): container finished" podID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" containerID="43155d6d820e8a85752f31086e3f128c376feb974c8f5148d1c73a4a6d88abf6" exitCode=0 Mar 20 11:08:57 crc kubenswrapper[4772]: I0320 11:08:57.189254 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvm67" event={"ID":"56d3e8ea-3da8-47b1-95a1-d3a062b74fab","Type":"ContainerDied","Data":"43155d6d820e8a85752f31086e3f128c376feb974c8f5148d1c73a4a6d88abf6"} Mar 20 11:08:57 crc kubenswrapper[4772]: I0320 11:08:57.189282 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvm67" event={"ID":"56d3e8ea-3da8-47b1-95a1-d3a062b74fab","Type":"ContainerStarted","Data":"f9fe9124de9052a10528d90a9151b077eae9ac0a16a3f61f5a156f222e27c313"} Mar 20 11:08:57 crc kubenswrapper[4772]: I0320 11:08:57.191030 4772 generic.go:334] "Generic (PLEG): container finished" podID="6f03ad17-5689-47ee-87f9-cb5562711b9d" containerID="422295c0b2aa2adecb683e426d08591042597ed8b8a232426f83965e0531b44d" exitCode=0 Mar 20 11:08:57 crc kubenswrapper[4772]: I0320 11:08:57.191070 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" event={"ID":"6f03ad17-5689-47ee-87f9-cb5562711b9d","Type":"ContainerDied","Data":"422295c0b2aa2adecb683e426d08591042597ed8b8a232426f83965e0531b44d"} Mar 20 11:08:58 crc kubenswrapper[4772]: I0320 11:08:58.200073 4772 generic.go:334] "Generic (PLEG): container finished" podID="6f03ad17-5689-47ee-87f9-cb5562711b9d" containerID="243cc0985dba5e331d6b8e555b088649d3b2538b80d46905031bb1b0e1f33a44" exitCode=0 Mar 20 11:08:58 crc kubenswrapper[4772]: I0320 11:08:58.200134 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" event={"ID":"6f03ad17-5689-47ee-87f9-cb5562711b9d","Type":"ContainerDied","Data":"243cc0985dba5e331d6b8e555b088649d3b2538b80d46905031bb1b0e1f33a44"} Mar 20 11:08:58 crc kubenswrapper[4772]: I0320 11:08:58.203504 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvm67" 
event={"ID":"56d3e8ea-3da8-47b1-95a1-d3a062b74fab","Type":"ContainerStarted","Data":"3b9438923d40993dc02858e17b49d98d657076d154a0435aa49789ea07906637"} Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.215055 4772 generic.go:334] "Generic (PLEG): container finished" podID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" containerID="3b9438923d40993dc02858e17b49d98d657076d154a0435aa49789ea07906637" exitCode=0 Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.215177 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvm67" event={"ID":"56d3e8ea-3da8-47b1-95a1-d3a062b74fab","Type":"ContainerDied","Data":"3b9438923d40993dc02858e17b49d98d657076d154a0435aa49789ea07906637"} Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.462899 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.617205 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-bundle\") pod \"6f03ad17-5689-47ee-87f9-cb5562711b9d\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.617441 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdcf9\" (UniqueName: \"kubernetes.io/projected/6f03ad17-5689-47ee-87f9-cb5562711b9d-kube-api-access-gdcf9\") pod \"6f03ad17-5689-47ee-87f9-cb5562711b9d\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.618219 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-bundle" (OuterVolumeSpecName: "bundle") pod "6f03ad17-5689-47ee-87f9-cb5562711b9d" (UID: "6f03ad17-5689-47ee-87f9-cb5562711b9d"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.618900 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-util\") pod \"6f03ad17-5689-47ee-87f9-cb5562711b9d\" (UID: \"6f03ad17-5689-47ee-87f9-cb5562711b9d\") " Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.619339 4772 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.624036 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f03ad17-5689-47ee-87f9-cb5562711b9d-kube-api-access-gdcf9" (OuterVolumeSpecName: "kube-api-access-gdcf9") pod "6f03ad17-5689-47ee-87f9-cb5562711b9d" (UID: "6f03ad17-5689-47ee-87f9-cb5562711b9d"). InnerVolumeSpecName "kube-api-access-gdcf9". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.638945 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-util" (OuterVolumeSpecName: "util") pod "6f03ad17-5689-47ee-87f9-cb5562711b9d" (UID: "6f03ad17-5689-47ee-87f9-cb5562711b9d"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.721069 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gdcf9\" (UniqueName: \"kubernetes.io/projected/6f03ad17-5689-47ee-87f9-cb5562711b9d-kube-api-access-gdcf9\") on node \"crc\" DevicePath \"\"" Mar 20 11:08:59 crc kubenswrapper[4772]: I0320 11:08:59.721102 4772 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/6f03ad17-5689-47ee-87f9-cb5562711b9d-util\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:00 crc kubenswrapper[4772]: I0320 11:09:00.225947 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" event={"ID":"6f03ad17-5689-47ee-87f9-cb5562711b9d","Type":"ContainerDied","Data":"0f7538b90f5481e04dd9d5e503831e610cab5e6029292594cdf1e229327552e7"} Mar 20 11:09:00 crc kubenswrapper[4772]: I0320 11:09:00.226354 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0f7538b90f5481e04dd9d5e503831e610cab5e6029292594cdf1e229327552e7" Mar 20 11:09:00 crc kubenswrapper[4772]: I0320 11:09:00.225983 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2" Mar 20 11:09:00 crc kubenswrapper[4772]: I0320 11:09:00.231213 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvm67" event={"ID":"56d3e8ea-3da8-47b1-95a1-d3a062b74fab","Type":"ContainerStarted","Data":"f0b86c769b631671f3b3b15b970fe15800b27a9e0a8678702e0b8f4889806ff6"} Mar 20 11:09:00 crc kubenswrapper[4772]: I0320 11:09:00.249472 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-rvm67" podStartSLOduration=2.785201604 podStartE2EDuration="5.24945249s" podCreationTimestamp="2026-03-20 11:08:55 +0000 UTC" firstStartedPulling="2026-03-20 11:08:57.190812804 +0000 UTC m=+823.281779289" lastFinishedPulling="2026-03-20 11:08:59.65506366 +0000 UTC m=+825.746030175" observedRunningTime="2026-03-20 11:09:00.248128954 +0000 UTC m=+826.339095439" watchObservedRunningTime="2026-03-20 11:09:00.24945249 +0000 UTC m=+826.340418975" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.716313 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-f2f98"] Mar 20 11:09:03 crc kubenswrapper[4772]: E0320 11:09:03.716922 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f03ad17-5689-47ee-87f9-cb5562711b9d" containerName="pull" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.716940 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f03ad17-5689-47ee-87f9-cb5562711b9d" containerName="pull" Mar 20 11:09:03 crc kubenswrapper[4772]: E0320 11:09:03.716956 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f03ad17-5689-47ee-87f9-cb5562711b9d" containerName="util" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.716965 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f03ad17-5689-47ee-87f9-cb5562711b9d" containerName="util" Mar 20 11:09:03 crc kubenswrapper[4772]: E0320 11:09:03.716976 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6f03ad17-5689-47ee-87f9-cb5562711b9d" containerName="extract" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.716984 4772 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="6f03ad17-5689-47ee-87f9-cb5562711b9d" containerName="extract" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.717111 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6f03ad17-5689-47ee-87f9-cb5562711b9d" containerName="extract" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.717592 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f2f98" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.719739 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-operator-dockercfg-5dmt4" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.719740 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"kube-root-ca.crt" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.719768 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"openshift-service-ca.crt" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.733253 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-f2f98"] Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.875976 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wlpvz\" (UniqueName: \"kubernetes.io/projected/0c24388b-eb64-4fc4-a732-4dd168057b7a-kube-api-access-wlpvz\") pod \"nmstate-operator-796d4cfff4-f2f98\" (UID: \"0c24388b-eb64-4fc4-a732-4dd168057b7a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-f2f98" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.977620 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wlpvz\" (UniqueName: \"kubernetes.io/projected/0c24388b-eb64-4fc4-a732-4dd168057b7a-kube-api-access-wlpvz\") pod \"nmstate-operator-796d4cfff4-f2f98\" (UID: \"0c24388b-eb64-4fc4-a732-4dd168057b7a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-f2f98" Mar 20 11:09:03 crc kubenswrapper[4772]: I0320 11:09:03.997748 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wlpvz\" (UniqueName: \"kubernetes.io/projected/0c24388b-eb64-4fc4-a732-4dd168057b7a-kube-api-access-wlpvz\") pod \"nmstate-operator-796d4cfff4-f2f98\" (UID: \"0c24388b-eb64-4fc4-a732-4dd168057b7a\") " pod="openshift-nmstate/nmstate-operator-796d4cfff4-f2f98" Mar 20 11:09:04 crc kubenswrapper[4772]: I0320 11:09:04.038056 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f2f98" Mar 20 11:09:04 crc kubenswrapper[4772]: I0320 11:09:04.451778 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-operator-796d4cfff4-f2f98"] Mar 20 11:09:04 crc kubenswrapper[4772]: W0320 11:09:04.457540 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c24388b_eb64_4fc4_a732_4dd168057b7a.slice/crio-bd4854b9fedd6977ece8ddefffd6675840c6de4657c4d52936addc065a84fcf7 WatchSource:0}: Error finding container bd4854b9fedd6977ece8ddefffd6675840c6de4657c4d52936addc065a84fcf7: Status 404 returned error can't find the container with id bd4854b9fedd6977ece8ddefffd6675840c6de4657c4d52936addc065a84fcf7 Mar 20 11:09:04 crc kubenswrapper[4772]: I0320 11:09:04.460848 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:09:05 crc kubenswrapper[4772]: I0320 11:09:05.256615 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f2f98" event={"ID":"0c24388b-eb64-4fc4-a732-4dd168057b7a","Type":"ContainerStarted","Data":"bd4854b9fedd6977ece8ddefffd6675840c6de4657c4d52936addc065a84fcf7"} Mar 20 11:09:06 crc kubenswrapper[4772]: I0320 11:09:06.330449 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:09:06 crc kubenswrapper[4772]: I0320 11:09:06.330493 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:09:06 crc kubenswrapper[4772]: I0320 11:09:06.368949 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:09:07 crc kubenswrapper[4772]: I0320 11:09:07.268286 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f2f98" event={"ID":"0c24388b-eb64-4fc4-a732-4dd168057b7a","Type":"ContainerStarted","Data":"51ff8923669db007ea656e4ff59d24d6b19345dd92f56ba51f2a1c1fef481e32"} Mar 20 11:09:07 crc kubenswrapper[4772]: I0320 11:09:07.282853 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-operator-796d4cfff4-f2f98" podStartSLOduration=2.139525477 podStartE2EDuration="4.282818892s" podCreationTimestamp="2026-03-20 11:09:03 +0000 UTC" firstStartedPulling="2026-03-20 11:09:04.460599529 +0000 UTC m=+830.551566014" lastFinishedPulling="2026-03-20 11:09:06.603892944 +0000 UTC m=+832.694859429" observedRunningTime="2026-03-20 11:09:07.281575928 +0000 UTC m=+833.372542413" watchObservedRunningTime="2026-03-20 11:09:07.282818892 +0000 UTC m=+833.373785377" Mar 20 11:09:07 crc kubenswrapper[4772]: I0320 11:09:07.310919 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:09:08 crc kubenswrapper[4772]: I0320 11:09:08.790067 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rvm67"] Mar 20 11:09:09 crc kubenswrapper[4772]: I0320 11:09:09.279138 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-rvm67" podUID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" containerName="registry-server" containerID="cri-o://f0b86c769b631671f3b3b15b970fe15800b27a9e0a8678702e0b8f4889806ff6" gracePeriod=2 Mar 20 
11:09:09 crc kubenswrapper[4772]: I0320 11:09:09.564797 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:09:09 crc kubenswrapper[4772]: I0320 11:09:09.564873 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:09:10 crc kubenswrapper[4772]: I0320 11:09:10.287255 4772 generic.go:334] "Generic (PLEG): container finished" podID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" containerID="f0b86c769b631671f3b3b15b970fe15800b27a9e0a8678702e0b8f4889806ff6" exitCode=0 Mar 20 11:09:10 crc kubenswrapper[4772]: I0320 11:09:10.287315 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvm67" event={"ID":"56d3e8ea-3da8-47b1-95a1-d3a062b74fab","Type":"ContainerDied","Data":"f0b86c769b631671f3b3b15b970fe15800b27a9e0a8678702e0b8f4889806ff6"} Mar 20 11:09:10 crc kubenswrapper[4772]: I0320 11:09:10.702604 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:09:10 crc kubenswrapper[4772]: I0320 11:09:10.858142 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27fmd\" (UniqueName: \"kubernetes.io/projected/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-kube-api-access-27fmd\") pod \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " Mar 20 11:09:10 crc kubenswrapper[4772]: I0320 11:09:10.858206 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-utilities\") pod \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " Mar 20 11:09:10 crc kubenswrapper[4772]: I0320 11:09:10.858233 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-catalog-content\") pod \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\" (UID: \"56d3e8ea-3da8-47b1-95a1-d3a062b74fab\") " Mar 20 11:09:10 crc kubenswrapper[4772]: I0320 11:09:10.859289 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-utilities" (OuterVolumeSpecName: "utilities") pod "56d3e8ea-3da8-47b1-95a1-d3a062b74fab" (UID: "56d3e8ea-3da8-47b1-95a1-d3a062b74fab"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:09:10 crc kubenswrapper[4772]: I0320 11:09:10.864598 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-kube-api-access-27fmd" (OuterVolumeSpecName: "kube-api-access-27fmd") pod "56d3e8ea-3da8-47b1-95a1-d3a062b74fab" (UID: "56d3e8ea-3da8-47b1-95a1-d3a062b74fab"). InnerVolumeSpecName "kube-api-access-27fmd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:09:10 crc kubenswrapper[4772]: I0320 11:09:10.959899 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27fmd\" (UniqueName: \"kubernetes.io/projected/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-kube-api-access-27fmd\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:10 crc kubenswrapper[4772]: I0320 11:09:10.959941 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:10 crc kubenswrapper[4772]: I0320 11:09:10.977163 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "56d3e8ea-3da8-47b1-95a1-d3a062b74fab" (UID: "56d3e8ea-3da8-47b1-95a1-d3a062b74fab"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:09:11 crc kubenswrapper[4772]: I0320 11:09:11.060578 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/56d3e8ea-3da8-47b1-95a1-d3a062b74fab-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:11 crc kubenswrapper[4772]: I0320 11:09:11.296306 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-rvm67" event={"ID":"56d3e8ea-3da8-47b1-95a1-d3a062b74fab","Type":"ContainerDied","Data":"f9fe9124de9052a10528d90a9151b077eae9ac0a16a3f61f5a156f222e27c313"} Mar 20 11:09:11 crc kubenswrapper[4772]: I0320 11:09:11.296352 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-rvm67" Mar 20 11:09:11 crc kubenswrapper[4772]: I0320 11:09:11.296362 4772 scope.go:117] "RemoveContainer" containerID="f0b86c769b631671f3b3b15b970fe15800b27a9e0a8678702e0b8f4889806ff6" Mar 20 11:09:11 crc kubenswrapper[4772]: I0320 11:09:11.325454 4772 scope.go:117] "RemoveContainer" containerID="3b9438923d40993dc02858e17b49d98d657076d154a0435aa49789ea07906637" Mar 20 11:09:11 crc kubenswrapper[4772]: I0320 11:09:11.327826 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-rvm67"] Mar 20 11:09:11 crc kubenswrapper[4772]: I0320 11:09:11.334950 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-rvm67"] Mar 20 11:09:11 crc kubenswrapper[4772]: I0320 11:09:11.357448 4772 scope.go:117] "RemoveContainer" containerID="43155d6d820e8a85752f31086e3f128c376feb974c8f5148d1c73a4a6d88abf6" Mar 20 11:09:12 crc kubenswrapper[4772]: I0320 11:09:12.650062 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" path="/var/lib/kubelet/pods/56d3e8ea-3da8-47b1-95a1-d3a062b74fab/volumes" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.591507 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w"] Mar 20 11:09:13 crc kubenswrapper[4772]: E0320 11:09:13.591790 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" containerName="registry-server" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.591802 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" containerName="registry-server" Mar 20 11:09:13 crc 
kubenswrapper[4772]: E0320 11:09:13.591830 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" containerName="extract-content" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.591853 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" containerName="extract-content" Mar 20 11:09:13 crc kubenswrapper[4772]: E0320 11:09:13.591863 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" containerName="extract-utilities" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.591871 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" containerName="extract-utilities" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.591962 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="56d3e8ea-3da8-47b1-95a1-d3a062b74fab" containerName="registry-server" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.592524 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.594049 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"nmstate-handler-dockercfg-wllw7" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.596098 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-cclbs"] Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.596988 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.598630 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"openshift-nmstate-webhook" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.662181 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-handler-8zk97"] Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.664857 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.674344 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-cclbs"] Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.694190 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhtjb\" (UniqueName: \"kubernetes.io/projected/bc2c37a7-9314-4e8d-9e9e-3e98cb73aded-kube-api-access-bhtjb\") pod \"nmstate-webhook-5f558f5558-cclbs\" (UID: \"bc2c37a7-9314-4e8d-9e9e-3e98cb73aded\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.694242 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-crh44\" (UniqueName: \"kubernetes.io/projected/986d86cc-279b-4511-95b8-10f80268aad4-kube-api-access-crh44\") pod \"nmstate-metrics-9b8c8685d-jx96w\" (UID: \"986d86cc-279b-4511-95b8-10f80268aad4\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.694281 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bc2c37a7-9314-4e8d-9e9e-3e98cb73aded-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-cclbs\" (UID: \"bc2c37a7-9314-4e8d-9e9e-3e98cb73aded\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.714051 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w"] Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.770331 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc"] Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.771183 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.774340 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"plugin-serving-cert" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.774749 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-nmstate"/"nginx-conf" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.775110 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-nmstate"/"default-dockercfg-4mthv" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.796580 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc"] Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.797463 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhtjb\" (UniqueName: \"kubernetes.io/projected/bc2c37a7-9314-4e8d-9e9e-3e98cb73aded-kube-api-access-bhtjb\") pod \"nmstate-webhook-5f558f5558-cclbs\" (UID: \"bc2c37a7-9314-4e8d-9e9e-3e98cb73aded\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.797538 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-crh44\" (UniqueName: \"kubernetes.io/projected/986d86cc-279b-4511-95b8-10f80268aad4-kube-api-access-crh44\") pod \"nmstate-metrics-9b8c8685d-jx96w\" (UID: \"986d86cc-279b-4511-95b8-10f80268aad4\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.797568 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/097b90ac-6ee9-4601-9a4d-db33981b1878-nmstate-lock\") pod \"nmstate-handler-8zk97\" (UID: \"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.797597 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/097b90ac-6ee9-4601-9a4d-db33981b1878-ovs-socket\") pod \"nmstate-handler-8zk97\" (UID: \"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.797615 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzdht\" (UniqueName: \"kubernetes.io/projected/097b90ac-6ee9-4601-9a4d-db33981b1878-kube-api-access-zzdht\") pod \"nmstate-handler-8zk97\" (UID: \"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.797640 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bc2c37a7-9314-4e8d-9e9e-3e98cb73aded-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-cclbs\" (UID: \"bc2c37a7-9314-4e8d-9e9e-3e98cb73aded\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.797663 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/097b90ac-6ee9-4601-9a4d-db33981b1878-dbus-socket\") pod \"nmstate-handler-8zk97\" (UID: 
\"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.818816 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhtjb\" (UniqueName: \"kubernetes.io/projected/bc2c37a7-9314-4e8d-9e9e-3e98cb73aded-kube-api-access-bhtjb\") pod \"nmstate-webhook-5f558f5558-cclbs\" (UID: \"bc2c37a7-9314-4e8d-9e9e-3e98cb73aded\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.819237 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-crh44\" (UniqueName: \"kubernetes.io/projected/986d86cc-279b-4511-95b8-10f80268aad4-kube-api-access-crh44\") pod \"nmstate-metrics-9b8c8685d-jx96w\" (UID: \"986d86cc-279b-4511-95b8-10f80268aad4\") " pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.823513 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tls-key-pair\" (UniqueName: \"kubernetes.io/secret/bc2c37a7-9314-4e8d-9e9e-3e98cb73aded-tls-key-pair\") pod \"nmstate-webhook-5f558f5558-cclbs\" (UID: \"bc2c37a7-9314-4e8d-9e9e-3e98cb73aded\") " pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.899004 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49hrd\" (UniqueName: \"kubernetes.io/projected/40b08b4b-058f-409a-9a32-d372878de5ad-kube-api-access-49hrd\") pod \"nmstate-console-plugin-86f58fcf4-gp8fc\" (UID: \"40b08b4b-058f-409a-9a32-d372878de5ad\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.899498 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/097b90ac-6ee9-4601-9a4d-db33981b1878-dbus-socket\") pod \"nmstate-handler-8zk97\" (UID: \"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.899613 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40b08b4b-058f-409a-9a32-d372878de5ad-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-gp8fc\" (UID: \"40b08b4b-058f-409a-9a32-d372878de5ad\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.899713 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40b08b4b-058f-409a-9a32-d372878de5ad-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-gp8fc\" (UID: \"40b08b4b-058f-409a-9a32-d372878de5ad\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.899791 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/097b90ac-6ee9-4601-9a4d-db33981b1878-nmstate-lock\") pod \"nmstate-handler-8zk97\" (UID: \"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.899930 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovs-socket\" (UniqueName: 
\"kubernetes.io/host-path/097b90ac-6ee9-4601-9a4d-db33981b1878-ovs-socket\") pod \"nmstate-handler-8zk97\" (UID: \"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.900001 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zzdht\" (UniqueName: \"kubernetes.io/projected/097b90ac-6ee9-4601-9a4d-db33981b1878-kube-api-access-zzdht\") pod \"nmstate-handler-8zk97\" (UID: \"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.900505 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dbus-socket\" (UniqueName: \"kubernetes.io/host-path/097b90ac-6ee9-4601-9a4d-db33981b1878-dbus-socket\") pod \"nmstate-handler-8zk97\" (UID: \"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.900619 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nmstate-lock\" (UniqueName: \"kubernetes.io/host-path/097b90ac-6ee9-4601-9a4d-db33981b1878-nmstate-lock\") pod \"nmstate-handler-8zk97\" (UID: \"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.900710 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovs-socket\" (UniqueName: \"kubernetes.io/host-path/097b90ac-6ee9-4601-9a4d-db33981b1878-ovs-socket\") pod \"nmstate-handler-8zk97\" (UID: \"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.922435 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzdht\" (UniqueName: \"kubernetes.io/projected/097b90ac-6ee9-4601-9a4d-db33981b1878-kube-api-access-zzdht\") pod \"nmstate-handler-8zk97\" (UID: \"097b90ac-6ee9-4601-9a4d-db33981b1878\") " pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.956542 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-85fbbb4bfd-lq6wb"] Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.957359 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.962243 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.970253 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85fbbb4bfd-lq6wb"] Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.977746 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" Mar 20 11:09:13 crc kubenswrapper[4772]: I0320 11:09:13.992542 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.001037 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-49hrd\" (UniqueName: \"kubernetes.io/projected/40b08b4b-058f-409a-9a32-d372878de5ad-kube-api-access-49hrd\") pod \"nmstate-console-plugin-86f58fcf4-gp8fc\" (UID: \"40b08b4b-058f-409a-9a32-d372878de5ad\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.001115 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40b08b4b-058f-409a-9a32-d372878de5ad-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-gp8fc\" (UID: \"40b08b4b-058f-409a-9a32-d372878de5ad\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.001155 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40b08b4b-058f-409a-9a32-d372878de5ad-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-gp8fc\" (UID: \"40b08b4b-058f-409a-9a32-d372878de5ad\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.002332 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40b08b4b-058f-409a-9a32-d372878de5ad-nginx-conf\") pod \"nmstate-console-plugin-86f58fcf4-gp8fc\" (UID: \"40b08b4b-058f-409a-9a32-d372878de5ad\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.004348 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40b08b4b-058f-409a-9a32-d372878de5ad-plugin-serving-cert\") pod \"nmstate-console-plugin-86f58fcf4-gp8fc\" (UID: \"40b08b4b-058f-409a-9a32-d372878de5ad\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.018543 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-49hrd\" (UniqueName: \"kubernetes.io/projected/40b08b4b-058f-409a-9a32-d372878de5ad-kube-api-access-49hrd\") pod \"nmstate-console-plugin-86f58fcf4-gp8fc\" (UID: \"40b08b4b-058f-409a-9a32-d372878de5ad\") " pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.098367 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.101934 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-service-ca\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.101983 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-trusted-ca-bundle\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.102016 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3be4d7db-c184-45eb-83fb-42b58f936ff4-console-oauth-config\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.102038 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25fr2\" (UniqueName: \"kubernetes.io/projected/3be4d7db-c184-45eb-83fb-42b58f936ff4-kube-api-access-25fr2\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.102066 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-console-config\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.102092 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3be4d7db-c184-45eb-83fb-42b58f936ff4-console-serving-cert\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.102147 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-oauth-serving-cert\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.203871 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-console-config\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.203913 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/3be4d7db-c184-45eb-83fb-42b58f936ff4-console-serving-cert\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.203967 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-oauth-serving-cert\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.203999 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-service-ca\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.204015 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-trusted-ca-bundle\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.204039 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/3be4d7db-c184-45eb-83fb-42b58f936ff4-console-oauth-config\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.204097 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-25fr2\" (UniqueName: \"kubernetes.io/projected/3be4d7db-c184-45eb-83fb-42b58f936ff4-kube-api-access-25fr2\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.205825 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-console-config\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.206468 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-oauth-serving-cert\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.206670 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-service-ca\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.209637 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/3be4d7db-c184-45eb-83fb-42b58f936ff4-console-oauth-config\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.209641 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/3be4d7db-c184-45eb-83fb-42b58f936ff4-trusted-ca-bundle\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.210670 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/3be4d7db-c184-45eb-83fb-42b58f936ff4-console-serving-cert\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.222599 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-25fr2\" (UniqueName: \"kubernetes.io/projected/3be4d7db-c184-45eb-83fb-42b58f936ff4-kube-api-access-25fr2\") pod \"console-85fbbb4bfd-lq6wb\" (UID: \"3be4d7db-c184-45eb-83fb-42b58f936ff4\") " pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.297944 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.317423 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8zk97" event={"ID":"097b90ac-6ee9-4601-9a4d-db33981b1878","Type":"ContainerStarted","Data":"48dd4f3231d8119211ce4cfcb32d1c29c80f5edd1860a69466e4ce8432e984fb"} Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.372771 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w"] Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.437157 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-webhook-5f558f5558-cclbs"] Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.500381 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85fbbb4bfd-lq6wb"] Mar 20 11:09:14 crc kubenswrapper[4772]: W0320 11:09:14.514696 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40b08b4b_058f_409a_9a32_d372878de5ad.slice/crio-3990e1cd9b398db3f15efd68672ee9d1dd6b40b1c9f6ad55a54b6b6bbef579d3 WatchSource:0}: Error finding container 3990e1cd9b398db3f15efd68672ee9d1dd6b40b1c9f6ad55a54b6b6bbef579d3: Status 404 returned error can't find the container with id 3990e1cd9b398db3f15efd68672ee9d1dd6b40b1c9f6ad55a54b6b6bbef579d3 Mar 20 11:09:14 crc kubenswrapper[4772]: I0320 11:09:14.515446 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc"] Mar 20 11:09:15 crc kubenswrapper[4772]: I0320 11:09:15.322833 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w" event={"ID":"986d86cc-279b-4511-95b8-10f80268aad4","Type":"ContainerStarted","Data":"905aa7073e73ce6b2b1ac829533bfea71dbb27eac7adf616ef04cb376b7c087a"} Mar 20 11:09:15 crc kubenswrapper[4772]: I0320 11:09:15.324062 4772 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="openshift-console/console-85fbbb4bfd-lq6wb" event={"ID":"3be4d7db-c184-45eb-83fb-42b58f936ff4","Type":"ContainerStarted","Data":"09e0a34961fcbb91557b713f2e0100b98bec31b5a2c2399e96cf36da5ca4caae"} Mar 20 11:09:15 crc kubenswrapper[4772]: I0320 11:09:15.324087 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85fbbb4bfd-lq6wb" event={"ID":"3be4d7db-c184-45eb-83fb-42b58f936ff4","Type":"ContainerStarted","Data":"023b63e9b0fbdc1bd0ed6e722af1596a2bf20b5b6bcd68976087535bdbc746d8"} Mar 20 11:09:15 crc kubenswrapper[4772]: I0320 11:09:15.324951 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" event={"ID":"40b08b4b-058f-409a-9a32-d372878de5ad","Type":"ContainerStarted","Data":"3990e1cd9b398db3f15efd68672ee9d1dd6b40b1c9f6ad55a54b6b6bbef579d3"} Mar 20 11:09:15 crc kubenswrapper[4772]: I0320 11:09:15.326611 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" event={"ID":"bc2c37a7-9314-4e8d-9e9e-3e98cb73aded","Type":"ContainerStarted","Data":"0173887cecc4107d344fb60f725976c6bda6666b2d99cc4e256310ca7215def6"} Mar 20 11:09:15 crc kubenswrapper[4772]: I0320 11:09:15.340425 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85fbbb4bfd-lq6wb" podStartSLOduration=2.340403171 podStartE2EDuration="2.340403171s" podCreationTimestamp="2026-03-20 11:09:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:09:15.339320922 +0000 UTC m=+841.430287407" watchObservedRunningTime="2026-03-20 11:09:15.340403171 +0000 UTC m=+841.431369656" Mar 20 11:09:17 crc kubenswrapper[4772]: I0320 11:09:17.351037 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w" event={"ID":"986d86cc-279b-4511-95b8-10f80268aad4","Type":"ContainerStarted","Data":"6811cb0eb5d3cc59fdefcbcf37168bc68f2c0832a69ea1e052992a62498ca494"} Mar 20 11:09:17 crc kubenswrapper[4772]: I0320 11:09:17.353376 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" event={"ID":"40b08b4b-058f-409a-9a32-d372878de5ad","Type":"ContainerStarted","Data":"1653d52960118e4a858345eb541cb6dfa216a978608fe45e1023672664a36d24"} Mar 20 11:09:17 crc kubenswrapper[4772]: I0320 11:09:17.357885 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-handler-8zk97" event={"ID":"097b90ac-6ee9-4601-9a4d-db33981b1878","Type":"ContainerStarted","Data":"6d142a18ab39887f8a7d53d254efaf928bae4bf2f4c3c829ecfb9d31abb0bfd4"} Mar 20 11:09:17 crc kubenswrapper[4772]: I0320 11:09:17.357974 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:17 crc kubenswrapper[4772]: I0320 11:09:17.360369 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" event={"ID":"bc2c37a7-9314-4e8d-9e9e-3e98cb73aded","Type":"ContainerStarted","Data":"bec6e4eae1f809edbc2783d02e6e699b6cd4ad2f868a43e77e743c1e1020dc46"} Mar 20 11:09:17 crc kubenswrapper[4772]: I0320 11:09:17.360485 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" Mar 20 11:09:17 crc kubenswrapper[4772]: I0320 11:09:17.409363 4772 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-nmstate/nmstate-console-plugin-86f58fcf4-gp8fc" podStartSLOduration=1.8645663319999999 podStartE2EDuration="4.409339253s" podCreationTimestamp="2026-03-20 11:09:13 +0000 UTC" firstStartedPulling="2026-03-20 11:09:14.518316118 +0000 UTC m=+840.609282603" lastFinishedPulling="2026-03-20 11:09:17.063089039 +0000 UTC m=+843.154055524" observedRunningTime="2026-03-20 11:09:17.370679165 +0000 UTC m=+843.461645650" watchObservedRunningTime="2026-03-20 11:09:17.409339253 +0000 UTC m=+843.500305748" Mar 20 11:09:17 crc kubenswrapper[4772]: I0320 11:09:17.416304 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" podStartSLOduration=2.722692179 podStartE2EDuration="4.416277116s" podCreationTimestamp="2026-03-20 11:09:13 +0000 UTC" firstStartedPulling="2026-03-20 11:09:14.440069902 +0000 UTC m=+840.531036387" lastFinishedPulling="2026-03-20 11:09:16.133654839 +0000 UTC m=+842.224621324" observedRunningTime="2026-03-20 11:09:17.414972682 +0000 UTC m=+843.505939177" watchObservedRunningTime="2026-03-20 11:09:17.416277116 +0000 UTC m=+843.507243611" Mar 20 11:09:17 crc kubenswrapper[4772]: I0320 11:09:17.432751 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-handler-8zk97" podStartSLOduration=2.322501497 podStartE2EDuration="4.43273678s" podCreationTimestamp="2026-03-20 11:09:13 +0000 UTC" firstStartedPulling="2026-03-20 11:09:14.022582474 +0000 UTC m=+840.113548959" lastFinishedPulling="2026-03-20 11:09:16.132817757 +0000 UTC m=+842.223784242" observedRunningTime="2026-03-20 11:09:17.430372777 +0000 UTC m=+843.521339262" watchObservedRunningTime="2026-03-20 11:09:17.43273678 +0000 UTC m=+843.523703265" Mar 20 11:09:19 crc kubenswrapper[4772]: I0320 11:09:19.376544 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w" event={"ID":"986d86cc-279b-4511-95b8-10f80268aad4","Type":"ContainerStarted","Data":"7cc17e9c4247d86e9c1823c0162bf9a13fcdd57c317d9fe45c4958bdb9d73c79"} Mar 20 11:09:19 crc kubenswrapper[4772]: I0320 11:09:19.400210 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-nmstate/nmstate-metrics-9b8c8685d-jx96w" podStartSLOduration=2.291989711 podStartE2EDuration="6.40018089s" podCreationTimestamp="2026-03-20 11:09:13 +0000 UTC" firstStartedPulling="2026-03-20 11:09:14.395035787 +0000 UTC m=+840.486002272" lastFinishedPulling="2026-03-20 11:09:18.503226966 +0000 UTC m=+844.594193451" observedRunningTime="2026-03-20 11:09:19.398669631 +0000 UTC m=+845.489636156" watchObservedRunningTime="2026-03-20 11:09:19.40018089 +0000 UTC m=+845.491147445" Mar 20 11:09:24 crc kubenswrapper[4772]: I0320 11:09:24.015704 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-handler-8zk97" Mar 20 11:09:24 crc kubenswrapper[4772]: I0320 11:09:24.298994 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:24 crc kubenswrapper[4772]: I0320 11:09:24.299274 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:24 crc kubenswrapper[4772]: I0320 11:09:24.303407 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:24 crc kubenswrapper[4772]: I0320 
11:09:24.413253 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85fbbb4bfd-lq6wb" Mar 20 11:09:24 crc kubenswrapper[4772]: I0320 11:09:24.470846 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-fgwgm"] Mar 20 11:09:33 crc kubenswrapper[4772]: I0320 11:09:33.985002 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-nmstate/nmstate-webhook-5f558f5558-cclbs" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.098982 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-g2cvm"] Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.100309 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.112164 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g2cvm"] Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.194824 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7m9fp\" (UniqueName: \"kubernetes.io/projected/382041b3-6cb3-484d-b50e-4d8475efd29f-kube-api-access-7m9fp\") pod \"community-operators-g2cvm\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.194954 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-catalog-content\") pod \"community-operators-g2cvm\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.195014 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-utilities\") pod \"community-operators-g2cvm\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.295982 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-utilities\") pod \"community-operators-g2cvm\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.296049 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7m9fp\" (UniqueName: \"kubernetes.io/projected/382041b3-6cb3-484d-b50e-4d8475efd29f-kube-api-access-7m9fp\") pod \"community-operators-g2cvm\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.296103 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-catalog-content\") pod \"community-operators-g2cvm\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.296563 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-utilities\") pod \"community-operators-g2cvm\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.296642 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-catalog-content\") pod \"community-operators-g2cvm\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.319692 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7m9fp\" (UniqueName: \"kubernetes.io/projected/382041b3-6cb3-484d-b50e-4d8475efd29f-kube-api-access-7m9fp\") pod \"community-operators-g2cvm\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.425384 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:34 crc kubenswrapper[4772]: I0320 11:09:34.929825 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-g2cvm"] Mar 20 11:09:34 crc kubenswrapper[4772]: W0320 11:09:34.944107 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod382041b3_6cb3_484d_b50e_4d8475efd29f.slice/crio-7eb818fe49ccf3779318c09ba1bcb93523e9828e75895404ae33036cfada352c WatchSource:0}: Error finding container 7eb818fe49ccf3779318c09ba1bcb93523e9828e75895404ae33036cfada352c: Status 404 returned error can't find the container with id 7eb818fe49ccf3779318c09ba1bcb93523e9828e75895404ae33036cfada352c Mar 20 11:09:35 crc kubenswrapper[4772]: I0320 11:09:35.484828 4772 generic.go:334] "Generic (PLEG): container finished" podID="382041b3-6cb3-484d-b50e-4d8475efd29f" containerID="78df8b366295af38168d64d800ca0ee63ebdf4e20cf8ec4852291dc887b96123" exitCode=0 Mar 20 11:09:35 crc kubenswrapper[4772]: I0320 11:09:35.485034 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g2cvm" event={"ID":"382041b3-6cb3-484d-b50e-4d8475efd29f","Type":"ContainerDied","Data":"78df8b366295af38168d64d800ca0ee63ebdf4e20cf8ec4852291dc887b96123"} Mar 20 11:09:35 crc kubenswrapper[4772]: I0320 11:09:35.485173 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g2cvm" event={"ID":"382041b3-6cb3-484d-b50e-4d8475efd29f","Type":"ContainerStarted","Data":"7eb818fe49ccf3779318c09ba1bcb93523e9828e75895404ae33036cfada352c"} Mar 20 11:09:36 crc kubenswrapper[4772]: I0320 11:09:36.491039 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g2cvm" event={"ID":"382041b3-6cb3-484d-b50e-4d8475efd29f","Type":"ContainerStarted","Data":"863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61"} Mar 20 11:09:37 crc kubenswrapper[4772]: I0320 11:09:37.499573 4772 generic.go:334] "Generic (PLEG): container finished" podID="382041b3-6cb3-484d-b50e-4d8475efd29f" containerID="863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61" exitCode=0 Mar 20 11:09:37 crc kubenswrapper[4772]: I0320 11:09:37.499684 4772 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g2cvm" event={"ID":"382041b3-6cb3-484d-b50e-4d8475efd29f","Type":"ContainerDied","Data":"863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61"} Mar 20 11:09:39 crc kubenswrapper[4772]: I0320 11:09:39.513672 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g2cvm" event={"ID":"382041b3-6cb3-484d-b50e-4d8475efd29f","Type":"ContainerStarted","Data":"c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987"} Mar 20 11:09:39 crc kubenswrapper[4772]: I0320 11:09:39.534943 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-g2cvm" podStartSLOduration=2.570345105 podStartE2EDuration="5.534925601s" podCreationTimestamp="2026-03-20 11:09:34 +0000 UTC" firstStartedPulling="2026-03-20 11:09:35.488635153 +0000 UTC m=+861.579601638" lastFinishedPulling="2026-03-20 11:09:38.453215649 +0000 UTC m=+864.544182134" observedRunningTime="2026-03-20 11:09:39.528537442 +0000 UTC m=+865.619503947" watchObservedRunningTime="2026-03-20 11:09:39.534925601 +0000 UTC m=+865.625892086" Mar 20 11:09:39 crc kubenswrapper[4772]: I0320 11:09:39.564324 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:09:39 crc kubenswrapper[4772]: I0320 11:09:39.564390 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:09:39 crc kubenswrapper[4772]: I0320 11:09:39.564435 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 11:09:39 crc kubenswrapper[4772]: I0320 11:09:39.564880 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"c031238f25d43745bddff1c50d95ad51119ab5adc3b084d1a2d3a9cfa70802a1"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:09:39 crc kubenswrapper[4772]: I0320 11:09:39.564927 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://c031238f25d43745bddff1c50d95ad51119ab5adc3b084d1a2d3a9cfa70802a1" gracePeriod=600 Mar 20 11:09:40 crc kubenswrapper[4772]: I0320 11:09:40.524468 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="c031238f25d43745bddff1c50d95ad51119ab5adc3b084d1a2d3a9cfa70802a1" exitCode=0 Mar 20 11:09:40 crc kubenswrapper[4772]: I0320 11:09:40.524545 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" 
event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"c031238f25d43745bddff1c50d95ad51119ab5adc3b084d1a2d3a9cfa70802a1"} Mar 20 11:09:40 crc kubenswrapper[4772]: I0320 11:09:40.525659 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"3a2dc425dd346ae424a2a128cb64ede7d6abbbfbc7a26799f2508db56e373109"} Mar 20 11:09:40 crc kubenswrapper[4772]: I0320 11:09:40.525678 4772 scope.go:117] "RemoveContainer" containerID="2a850ae2fa64972c823328c8fe9588b84861e2acf1ca840a7222809a1aa0c1ff" Mar 20 11:09:44 crc kubenswrapper[4772]: I0320 11:09:44.426371 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:44 crc kubenswrapper[4772]: I0320 11:09:44.427146 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:44 crc kubenswrapper[4772]: I0320 11:09:44.471101 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:44 crc kubenswrapper[4772]: I0320 11:09:44.594416 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:44 crc kubenswrapper[4772]: I0320 11:09:44.701983 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g2cvm"] Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.351187 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x"] Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.352425 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.354501 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.361733 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x"] Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.453447 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fl2mb\" (UniqueName: \"kubernetes.io/projected/860d09d3-69c4-44e1-9756-cbd62cdd94cc-kube-api-access-fl2mb\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.453520 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.453588 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.555407 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fl2mb\" (UniqueName: \"kubernetes.io/projected/860d09d3-69c4-44e1-9756-cbd62cdd94cc-kube-api-access-fl2mb\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.555912 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.556586 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-util\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.556673 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.556976 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-bundle\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.565416 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-g2cvm" podUID="382041b3-6cb3-484d-b50e-4d8475efd29f" containerName="registry-server" containerID="cri-o://c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987" gracePeriod=2 Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.579549 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fl2mb\" (UniqueName: \"kubernetes.io/projected/860d09d3-69c4-44e1-9756-cbd62cdd94cc-kube-api-access-fl2mb\") pod \"2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.668332 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.897864 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x"] Mar 20 11:09:46 crc kubenswrapper[4772]: I0320 11:09:46.920744 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.065188 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-utilities\") pod \"382041b3-6cb3-484d-b50e-4d8475efd29f\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.065240 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7m9fp\" (UniqueName: \"kubernetes.io/projected/382041b3-6cb3-484d-b50e-4d8475efd29f-kube-api-access-7m9fp\") pod \"382041b3-6cb3-484d-b50e-4d8475efd29f\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.065296 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-catalog-content\") pod \"382041b3-6cb3-484d-b50e-4d8475efd29f\" (UID: \"382041b3-6cb3-484d-b50e-4d8475efd29f\") " Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.066353 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-utilities" (OuterVolumeSpecName: "utilities") pod "382041b3-6cb3-484d-b50e-4d8475efd29f" (UID: "382041b3-6cb3-484d-b50e-4d8475efd29f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.070223 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/382041b3-6cb3-484d-b50e-4d8475efd29f-kube-api-access-7m9fp" (OuterVolumeSpecName: "kube-api-access-7m9fp") pod "382041b3-6cb3-484d-b50e-4d8475efd29f" (UID: "382041b3-6cb3-484d-b50e-4d8475efd29f"). InnerVolumeSpecName "kube-api-access-7m9fp". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.166250 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7m9fp\" (UniqueName: \"kubernetes.io/projected/382041b3-6cb3-484d-b50e-4d8475efd29f-kube-api-access-7m9fp\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.166283 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.274524 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "382041b3-6cb3-484d-b50e-4d8475efd29f" (UID: "382041b3-6cb3-484d-b50e-4d8475efd29f"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.369442 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/382041b3-6cb3-484d-b50e-4d8475efd29f-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.572619 4772 generic.go:334] "Generic (PLEG): container finished" podID="382041b3-6cb3-484d-b50e-4d8475efd29f" containerID="c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987" exitCode=0 Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.572681 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-g2cvm" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.572702 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g2cvm" event={"ID":"382041b3-6cb3-484d-b50e-4d8475efd29f","Type":"ContainerDied","Data":"c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987"} Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.572731 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-g2cvm" event={"ID":"382041b3-6cb3-484d-b50e-4d8475efd29f","Type":"ContainerDied","Data":"7eb818fe49ccf3779318c09ba1bcb93523e9828e75895404ae33036cfada352c"} Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.572752 4772 scope.go:117] "RemoveContainer" containerID="c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.574836 4772 generic.go:334] "Generic (PLEG): container finished" podID="860d09d3-69c4-44e1-9756-cbd62cdd94cc" containerID="cff5febe648389daea483822aa5025c7ab26a5f092d70477c16f5309343c108b" exitCode=0 Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.574901 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" event={"ID":"860d09d3-69c4-44e1-9756-cbd62cdd94cc","Type":"ContainerDied","Data":"cff5febe648389daea483822aa5025c7ab26a5f092d70477c16f5309343c108b"} Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.574928 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" event={"ID":"860d09d3-69c4-44e1-9756-cbd62cdd94cc","Type":"ContainerStarted","Data":"03e5c78b38e836ae7267c95bce77c72ccf024e34398799393edd21a7ec5ad066"} Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.611231 4772 scope.go:117] "RemoveContainer" containerID="863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.616461 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-g2cvm"] Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.622824 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-g2cvm"] Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.644024 4772 scope.go:117] "RemoveContainer" containerID="78df8b366295af38168d64d800ca0ee63ebdf4e20cf8ec4852291dc887b96123" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.669080 4772 scope.go:117] "RemoveContainer" containerID="c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987" Mar 20 11:09:47 crc kubenswrapper[4772]: E0320 11:09:47.669491 4772 log.go:32] 
"ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987\": container with ID starting with c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987 not found: ID does not exist" containerID="c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.669534 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987"} err="failed to get container status \"c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987\": rpc error: code = NotFound desc = could not find container \"c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987\": container with ID starting with c629c7d93c2b747c8386245ca648dc388c059a3bdf8d53e2a93d57cc148fb987 not found: ID does not exist" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.669565 4772 scope.go:117] "RemoveContainer" containerID="863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61" Mar 20 11:09:47 crc kubenswrapper[4772]: E0320 11:09:47.670045 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61\": container with ID starting with 863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61 not found: ID does not exist" containerID="863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.670082 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61"} err="failed to get container status \"863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61\": rpc error: code = NotFound desc = could not find container \"863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61\": container with ID starting with 863a4c089cf6d49d346d9475c5ac83bf49010855843c5bc07362a9f496df7c61 not found: ID does not exist" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.670108 4772 scope.go:117] "RemoveContainer" containerID="78df8b366295af38168d64d800ca0ee63ebdf4e20cf8ec4852291dc887b96123" Mar 20 11:09:47 crc kubenswrapper[4772]: E0320 11:09:47.670505 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78df8b366295af38168d64d800ca0ee63ebdf4e20cf8ec4852291dc887b96123\": container with ID starting with 78df8b366295af38168d64d800ca0ee63ebdf4e20cf8ec4852291dc887b96123 not found: ID does not exist" containerID="78df8b366295af38168d64d800ca0ee63ebdf4e20cf8ec4852291dc887b96123" Mar 20 11:09:47 crc kubenswrapper[4772]: I0320 11:09:47.670544 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78df8b366295af38168d64d800ca0ee63ebdf4e20cf8ec4852291dc887b96123"} err="failed to get container status \"78df8b366295af38168d64d800ca0ee63ebdf4e20cf8ec4852291dc887b96123\": rpc error: code = NotFound desc = could not find container \"78df8b366295af38168d64d800ca0ee63ebdf4e20cf8ec4852291dc887b96123\": container with ID starting with 78df8b366295af38168d64d800ca0ee63ebdf4e20cf8ec4852291dc887b96123 not found: ID does not exist" Mar 20 11:09:48 crc kubenswrapper[4772]: I0320 11:09:48.648873 4772 kubelet_volumes.go:163] "Cleaned 
up orphaned pod volumes dir" podUID="382041b3-6cb3-484d-b50e-4d8475efd29f" path="/var/lib/kubelet/pods/382041b3-6cb3-484d-b50e-4d8475efd29f/volumes" Mar 20 11:09:49 crc kubenswrapper[4772]: I0320 11:09:49.508854 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-console/console-f9d7485db-fgwgm" podUID="f7c20397-4233-45e6-a7f9-5e88942e7abf" containerName="console" containerID="cri-o://5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974" gracePeriod=15 Mar 20 11:09:49 crc kubenswrapper[4772]: I0320 11:09:49.589610 4772 generic.go:334] "Generic (PLEG): container finished" podID="860d09d3-69c4-44e1-9756-cbd62cdd94cc" containerID="32fa6c0bfa9efbe6f3e73a75796060417572f662a1a04f6183619d14518cd636" exitCode=0 Mar 20 11:09:49 crc kubenswrapper[4772]: I0320 11:09:49.589653 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" event={"ID":"860d09d3-69c4-44e1-9756-cbd62cdd94cc","Type":"ContainerDied","Data":"32fa6c0bfa9efbe6f3e73a75796060417572f662a1a04f6183619d14518cd636"} Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.495006 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-fgwgm_f7c20397-4233-45e6-a7f9-5e88942e7abf/console/0.log" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.495306 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.604926 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-f9d7485db-fgwgm_f7c20397-4233-45e6-a7f9-5e88942e7abf/console/0.log" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.604978 4772 generic.go:334] "Generic (PLEG): container finished" podID="f7c20397-4233-45e6-a7f9-5e88942e7abf" containerID="5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974" exitCode=2 Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.605036 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fgwgm" event={"ID":"f7c20397-4233-45e6-a7f9-5e88942e7abf","Type":"ContainerDied","Data":"5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974"} Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.605066 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-fgwgm" event={"ID":"f7c20397-4233-45e6-a7f9-5e88942e7abf","Type":"ContainerDied","Data":"992c5443a50bbbf5c4d7a2cd973bfdefb8244e6d96035d5e04e58f9cfe3739c0"} Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.605086 4772 scope.go:117] "RemoveContainer" containerID="5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.605200 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-f9d7485db-fgwgm" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.610399 4772 generic.go:334] "Generic (PLEG): container finished" podID="860d09d3-69c4-44e1-9756-cbd62cdd94cc" containerID="db14944e050efc1d514724bccd4a8c65bbb9778d35238b3aaea9e897efda8321" exitCode=0 Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.610446 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" event={"ID":"860d09d3-69c4-44e1-9756-cbd62cdd94cc","Type":"ContainerDied","Data":"db14944e050efc1d514724bccd4a8c65bbb9778d35238b3aaea9e897efda8321"} Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.613267 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-service-ca\") pod \"f7c20397-4233-45e6-a7f9-5e88942e7abf\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.613367 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-oauth-serving-cert\") pod \"f7c20397-4233-45e6-a7f9-5e88942e7abf\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.613405 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-config\") pod \"f7c20397-4233-45e6-a7f9-5e88942e7abf\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.613448 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-serving-cert\") pod \"f7c20397-4233-45e6-a7f9-5e88942e7abf\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.613476 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-trusted-ca-bundle\") pod \"f7c20397-4233-45e6-a7f9-5e88942e7abf\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.613535 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vf7zg\" (UniqueName: \"kubernetes.io/projected/f7c20397-4233-45e6-a7f9-5e88942e7abf-kube-api-access-vf7zg\") pod \"f7c20397-4233-45e6-a7f9-5e88942e7abf\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.613568 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-oauth-config\") pod \"f7c20397-4233-45e6-a7f9-5e88942e7abf\" (UID: \"f7c20397-4233-45e6-a7f9-5e88942e7abf\") " Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.615103 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-service-ca" (OuterVolumeSpecName: "service-ca") pod "f7c20397-4233-45e6-a7f9-5e88942e7abf" (UID: "f7c20397-4233-45e6-a7f9-5e88942e7abf"). 
InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.615835 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "f7c20397-4233-45e6-a7f9-5e88942e7abf" (UID: "f7c20397-4233-45e6-a7f9-5e88942e7abf"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.615900 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-config" (OuterVolumeSpecName: "console-config") pod "f7c20397-4233-45e6-a7f9-5e88942e7abf" (UID: "f7c20397-4233-45e6-a7f9-5e88942e7abf"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.616539 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "f7c20397-4233-45e6-a7f9-5e88942e7abf" (UID: "f7c20397-4233-45e6-a7f9-5e88942e7abf"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.620863 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "f7c20397-4233-45e6-a7f9-5e88942e7abf" (UID: "f7c20397-4233-45e6-a7f9-5e88942e7abf"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.621571 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "f7c20397-4233-45e6-a7f9-5e88942e7abf" (UID: "f7c20397-4233-45e6-a7f9-5e88942e7abf"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.621640 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f7c20397-4233-45e6-a7f9-5e88942e7abf-kube-api-access-vf7zg" (OuterVolumeSpecName: "kube-api-access-vf7zg") pod "f7c20397-4233-45e6-a7f9-5e88942e7abf" (UID: "f7c20397-4233-45e6-a7f9-5e88942e7abf"). InnerVolumeSpecName "kube-api-access-vf7zg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.625979 4772 scope.go:117] "RemoveContainer" containerID="5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974" Mar 20 11:09:50 crc kubenswrapper[4772]: E0320 11:09:50.626564 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974\": container with ID starting with 5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974 not found: ID does not exist" containerID="5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.626596 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974"} err="failed to get container status \"5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974\": rpc error: code = NotFound desc = could not find container \"5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974\": container with ID starting with 5a671a4b4a0e7c8d24fd731b64261b266046340ea2018cce1bc89c80b03a2974 not found: ID does not exist" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.715234 4772 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.715274 4772 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.715283 4772 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-serving-cert\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.715293 4772 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.715301 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vf7zg\" (UniqueName: \"kubernetes.io/projected/f7c20397-4233-45e6-a7f9-5e88942e7abf-kube-api-access-vf7zg\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.715310 4772 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/f7c20397-4233-45e6-a7f9-5e88942e7abf-console-oauth-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.715319 4772 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/f7c20397-4233-45e6-a7f9-5e88942e7abf-service-ca\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.921620 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-console/console-f9d7485db-fgwgm"] Mar 20 11:09:50 crc kubenswrapper[4772]: I0320 11:09:50.925468 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-f9d7485db-fgwgm"] Mar 20 
11:09:51 crc kubenswrapper[4772]: I0320 11:09:51.836632 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:51 crc kubenswrapper[4772]: I0320 11:09:51.930268 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-util\") pod \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " Mar 20 11:09:51 crc kubenswrapper[4772]: I0320 11:09:51.930667 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-bundle\") pod \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " Mar 20 11:09:51 crc kubenswrapper[4772]: I0320 11:09:51.930702 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fl2mb\" (UniqueName: \"kubernetes.io/projected/860d09d3-69c4-44e1-9756-cbd62cdd94cc-kube-api-access-fl2mb\") pod \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\" (UID: \"860d09d3-69c4-44e1-9756-cbd62cdd94cc\") " Mar 20 11:09:51 crc kubenswrapper[4772]: I0320 11:09:51.931564 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-bundle" (OuterVolumeSpecName: "bundle") pod "860d09d3-69c4-44e1-9756-cbd62cdd94cc" (UID: "860d09d3-69c4-44e1-9756-cbd62cdd94cc"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4772]: I0320 11:09:51.934622 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/860d09d3-69c4-44e1-9756-cbd62cdd94cc-kube-api-access-fl2mb" (OuterVolumeSpecName: "kube-api-access-fl2mb") pod "860d09d3-69c4-44e1-9756-cbd62cdd94cc" (UID: "860d09d3-69c4-44e1-9756-cbd62cdd94cc"). InnerVolumeSpecName "kube-api-access-fl2mb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:09:51 crc kubenswrapper[4772]: I0320 11:09:51.947878 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-util" (OuterVolumeSpecName: "util") pod "860d09d3-69c4-44e1-9756-cbd62cdd94cc" (UID: "860d09d3-69c4-44e1-9756-cbd62cdd94cc"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:09:52 crc kubenswrapper[4772]: I0320 11:09:52.032258 4772 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:52 crc kubenswrapper[4772]: I0320 11:09:52.032291 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fl2mb\" (UniqueName: \"kubernetes.io/projected/860d09d3-69c4-44e1-9756-cbd62cdd94cc-kube-api-access-fl2mb\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:52 crc kubenswrapper[4772]: I0320 11:09:52.032302 4772 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/860d09d3-69c4-44e1-9756-cbd62cdd94cc-util\") on node \"crc\" DevicePath \"\"" Mar 20 11:09:52 crc kubenswrapper[4772]: I0320 11:09:52.625177 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" event={"ID":"860d09d3-69c4-44e1-9756-cbd62cdd94cc","Type":"ContainerDied","Data":"03e5c78b38e836ae7267c95bce77c72ccf024e34398799393edd21a7ec5ad066"} Mar 20 11:09:52 crc kubenswrapper[4772]: I0320 11:09:52.625222 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="03e5c78b38e836ae7267c95bce77c72ccf024e34398799393edd21a7ec5ad066" Mar 20 11:09:52 crc kubenswrapper[4772]: I0320 11:09:52.625245 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x" Mar 20 11:09:52 crc kubenswrapper[4772]: I0320 11:09:52.648102 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f7c20397-4233-45e6-a7f9-5e88942e7abf" path="/var/lib/kubelet/pods/f7c20397-4233-45e6-a7f9-5e88942e7abf/volumes" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.123244 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566750-vw5dq"] Mar 20 11:10:00 crc kubenswrapper[4772]: E0320 11:10:00.123947 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860d09d3-69c4-44e1-9756-cbd62cdd94cc" containerName="extract" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.123960 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="860d09d3-69c4-44e1-9756-cbd62cdd94cc" containerName="extract" Mar 20 11:10:00 crc kubenswrapper[4772]: E0320 11:10:00.123971 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382041b3-6cb3-484d-b50e-4d8475efd29f" containerName="registry-server" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.123977 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="382041b3-6cb3-484d-b50e-4d8475efd29f" containerName="registry-server" Mar 20 11:10:00 crc kubenswrapper[4772]: E0320 11:10:00.123985 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382041b3-6cb3-484d-b50e-4d8475efd29f" containerName="extract-content" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.123990 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="382041b3-6cb3-484d-b50e-4d8475efd29f" containerName="extract-content" Mar 20 11:10:00 crc kubenswrapper[4772]: E0320 11:10:00.124002 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860d09d3-69c4-44e1-9756-cbd62cdd94cc" containerName="pull" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.124008 4772 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="860d09d3-69c4-44e1-9756-cbd62cdd94cc" containerName="pull" Mar 20 11:10:00 crc kubenswrapper[4772]: E0320 11:10:00.124015 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="382041b3-6cb3-484d-b50e-4d8475efd29f" containerName="extract-utilities" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.124022 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="382041b3-6cb3-484d-b50e-4d8475efd29f" containerName="extract-utilities" Mar 20 11:10:00 crc kubenswrapper[4772]: E0320 11:10:00.124031 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f7c20397-4233-45e6-a7f9-5e88942e7abf" containerName="console" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.124036 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="f7c20397-4233-45e6-a7f9-5e88942e7abf" containerName="console" Mar 20 11:10:00 crc kubenswrapper[4772]: E0320 11:10:00.124045 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="860d09d3-69c4-44e1-9756-cbd62cdd94cc" containerName="util" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.124051 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="860d09d3-69c4-44e1-9756-cbd62cdd94cc" containerName="util" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.124155 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="860d09d3-69c4-44e1-9756-cbd62cdd94cc" containerName="extract" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.124166 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="f7c20397-4233-45e6-a7f9-5e88942e7abf" containerName="console" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.124179 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="382041b3-6cb3-484d-b50e-4d8475efd29f" containerName="registry-server" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.124671 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-vw5dq" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.130447 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.130670 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.130762 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.132545 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-vw5dq"] Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.234217 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r9h6z\" (UniqueName: \"kubernetes.io/projected/8405588b-3a72-4eb6-8223-277ba942d2b1-kube-api-access-r9h6z\") pod \"auto-csr-approver-29566750-vw5dq\" (UID: \"8405588b-3a72-4eb6-8223-277ba942d2b1\") " pod="openshift-infra/auto-csr-approver-29566750-vw5dq" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.335432 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-r9h6z\" (UniqueName: \"kubernetes.io/projected/8405588b-3a72-4eb6-8223-277ba942d2b1-kube-api-access-r9h6z\") pod \"auto-csr-approver-29566750-vw5dq\" (UID: \"8405588b-3a72-4eb6-8223-277ba942d2b1\") " pod="openshift-infra/auto-csr-approver-29566750-vw5dq" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.360826 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-r9h6z\" (UniqueName: \"kubernetes.io/projected/8405588b-3a72-4eb6-8223-277ba942d2b1-kube-api-access-r9h6z\") pod \"auto-csr-approver-29566750-vw5dq\" (UID: \"8405588b-3a72-4eb6-8223-277ba942d2b1\") " pod="openshift-infra/auto-csr-approver-29566750-vw5dq" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.442169 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8"] Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.443035 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.443135 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-vw5dq" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.445291 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-xbg58" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.445763 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.445868 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.446404 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.446408 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.458342 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8"] Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.537496 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/344e34de-aef3-4ac6-9492-e6d359b5966d-webhook-cert\") pod \"metallb-operator-controller-manager-6f47d558c9-8f9x8\" (UID: \"344e34de-aef3-4ac6-9492-e6d359b5966d\") " pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.537560 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/344e34de-aef3-4ac6-9492-e6d359b5966d-apiservice-cert\") pod \"metallb-operator-controller-manager-6f47d558c9-8f9x8\" (UID: \"344e34de-aef3-4ac6-9492-e6d359b5966d\") " pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.537784 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxl6m\" (UniqueName: \"kubernetes.io/projected/344e34de-aef3-4ac6-9492-e6d359b5966d-kube-api-access-jxl6m\") pod \"metallb-operator-controller-manager-6f47d558c9-8f9x8\" (UID: \"344e34de-aef3-4ac6-9492-e6d359b5966d\") " pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.641411 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/344e34de-aef3-4ac6-9492-e6d359b5966d-apiservice-cert\") pod \"metallb-operator-controller-manager-6f47d558c9-8f9x8\" (UID: \"344e34de-aef3-4ac6-9492-e6d359b5966d\") " pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.641753 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxl6m\" (UniqueName: \"kubernetes.io/projected/344e34de-aef3-4ac6-9492-e6d359b5966d-kube-api-access-jxl6m\") pod \"metallb-operator-controller-manager-6f47d558c9-8f9x8\" (UID: \"344e34de-aef3-4ac6-9492-e6d359b5966d\") " pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.641796 4772 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/344e34de-aef3-4ac6-9492-e6d359b5966d-webhook-cert\") pod \"metallb-operator-controller-manager-6f47d558c9-8f9x8\" (UID: \"344e34de-aef3-4ac6-9492-e6d359b5966d\") " pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.646393 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/344e34de-aef3-4ac6-9492-e6d359b5966d-webhook-cert\") pod \"metallb-operator-controller-manager-6f47d558c9-8f9x8\" (UID: \"344e34de-aef3-4ac6-9492-e6d359b5966d\") " pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.661807 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/344e34de-aef3-4ac6-9492-e6d359b5966d-apiservice-cert\") pod \"metallb-operator-controller-manager-6f47d558c9-8f9x8\" (UID: \"344e34de-aef3-4ac6-9492-e6d359b5966d\") " pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.675493 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxl6m\" (UniqueName: \"kubernetes.io/projected/344e34de-aef3-4ac6-9492-e6d359b5966d-kube-api-access-jxl6m\") pod \"metallb-operator-controller-manager-6f47d558c9-8f9x8\" (UID: \"344e34de-aef3-4ac6-9492-e6d359b5966d\") " pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.795743 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5"] Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.796524 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.798134 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.798409 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-b2mpc" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.798631 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.803849 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.810619 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5"] Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.844644 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b117c6bf-28df-4a1b-bda6-b96dc51f6531-apiservice-cert\") pod \"metallb-operator-webhook-server-5bcf94d488-gf9n5\" (UID: \"b117c6bf-28df-4a1b-bda6-b96dc51f6531\") " pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.844694 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b117c6bf-28df-4a1b-bda6-b96dc51f6531-webhook-cert\") pod \"metallb-operator-webhook-server-5bcf94d488-gf9n5\" (UID: \"b117c6bf-28df-4a1b-bda6-b96dc51f6531\") " pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.844728 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bhzl2\" (UniqueName: \"kubernetes.io/projected/b117c6bf-28df-4a1b-bda6-b96dc51f6531-kube-api-access-bhzl2\") pod \"metallb-operator-webhook-server-5bcf94d488-gf9n5\" (UID: \"b117c6bf-28df-4a1b-bda6-b96dc51f6531\") " pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.890086 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-vw5dq"] Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.945697 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b117c6bf-28df-4a1b-bda6-b96dc51f6531-apiservice-cert\") pod \"metallb-operator-webhook-server-5bcf94d488-gf9n5\" (UID: \"b117c6bf-28df-4a1b-bda6-b96dc51f6531\") " pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.945755 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b117c6bf-28df-4a1b-bda6-b96dc51f6531-webhook-cert\") pod \"metallb-operator-webhook-server-5bcf94d488-gf9n5\" (UID: \"b117c6bf-28df-4a1b-bda6-b96dc51f6531\") " pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.945801 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bhzl2\" (UniqueName: \"kubernetes.io/projected/b117c6bf-28df-4a1b-bda6-b96dc51f6531-kube-api-access-bhzl2\") pod \"metallb-operator-webhook-server-5bcf94d488-gf9n5\" (UID: \"b117c6bf-28df-4a1b-bda6-b96dc51f6531\") " pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.953398 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/b117c6bf-28df-4a1b-bda6-b96dc51f6531-apiservice-cert\") pod \"metallb-operator-webhook-server-5bcf94d488-gf9n5\" (UID: \"b117c6bf-28df-4a1b-bda6-b96dc51f6531\") " 
pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.952723 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/b117c6bf-28df-4a1b-bda6-b96dc51f6531-webhook-cert\") pod \"metallb-operator-webhook-server-5bcf94d488-gf9n5\" (UID: \"b117c6bf-28df-4a1b-bda6-b96dc51f6531\") " pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:00 crc kubenswrapper[4772]: I0320 11:10:00.977379 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bhzl2\" (UniqueName: \"kubernetes.io/projected/b117c6bf-28df-4a1b-bda6-b96dc51f6531-kube-api-access-bhzl2\") pod \"metallb-operator-webhook-server-5bcf94d488-gf9n5\" (UID: \"b117c6bf-28df-4a1b-bda6-b96dc51f6531\") " pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:01 crc kubenswrapper[4772]: I0320 11:10:01.065807 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8"] Mar 20 11:10:01 crc kubenswrapper[4772]: I0320 11:10:01.111134 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:01 crc kubenswrapper[4772]: I0320 11:10:01.297338 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5"] Mar 20 11:10:01 crc kubenswrapper[4772]: W0320 11:10:01.306052 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb117c6bf_28df_4a1b_bda6_b96dc51f6531.slice/crio-8b3f542f6a9c5fec81a3546e66016ddd96ec5a325426ba5047a108ab7ffccac1 WatchSource:0}: Error finding container 8b3f542f6a9c5fec81a3546e66016ddd96ec5a325426ba5047a108ab7ffccac1: Status 404 returned error can't find the container with id 8b3f542f6a9c5fec81a3546e66016ddd96ec5a325426ba5047a108ab7ffccac1 Mar 20 11:10:01 crc kubenswrapper[4772]: I0320 11:10:01.676239 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" event={"ID":"b117c6bf-28df-4a1b-bda6-b96dc51f6531","Type":"ContainerStarted","Data":"8b3f542f6a9c5fec81a3546e66016ddd96ec5a325426ba5047a108ab7ffccac1"} Mar 20 11:10:01 crc kubenswrapper[4772]: I0320 11:10:01.677364 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" event={"ID":"344e34de-aef3-4ac6-9492-e6d359b5966d","Type":"ContainerStarted","Data":"a387e683bd58ebf7e55008e83105e270efc03cc08b45cc8904f5e68b5f4be8d9"} Mar 20 11:10:01 crc kubenswrapper[4772]: I0320 11:10:01.678744 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-vw5dq" event={"ID":"8405588b-3a72-4eb6-8223-277ba942d2b1","Type":"ContainerStarted","Data":"2681ffa976f1cdc651851cdeb4ad2ab0f4cba211332ef5ce7012f5ead4eee1df"} Mar 20 11:10:02 crc kubenswrapper[4772]: I0320 11:10:02.688036 4772 generic.go:334] "Generic (PLEG): container finished" podID="8405588b-3a72-4eb6-8223-277ba942d2b1" containerID="3d1ab177438067585a87dc1f86da415365ecd5a7c064ac8d0005e88c9d3dd16b" exitCode=0 Mar 20 11:10:02 crc kubenswrapper[4772]: I0320 11:10:02.688097 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-vw5dq" 
event={"ID":"8405588b-3a72-4eb6-8223-277ba942d2b1","Type":"ContainerDied","Data":"3d1ab177438067585a87dc1f86da415365ecd5a7c064ac8d0005e88c9d3dd16b"} Mar 20 11:10:04 crc kubenswrapper[4772]: I0320 11:10:04.624445 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-vw5dq" Mar 20 11:10:04 crc kubenswrapper[4772]: I0320 11:10:04.700312 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r9h6z\" (UniqueName: \"kubernetes.io/projected/8405588b-3a72-4eb6-8223-277ba942d2b1-kube-api-access-r9h6z\") pod \"8405588b-3a72-4eb6-8223-277ba942d2b1\" (UID: \"8405588b-3a72-4eb6-8223-277ba942d2b1\") " Mar 20 11:10:04 crc kubenswrapper[4772]: I0320 11:10:04.713013 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8405588b-3a72-4eb6-8223-277ba942d2b1-kube-api-access-r9h6z" (OuterVolumeSpecName: "kube-api-access-r9h6z") pod "8405588b-3a72-4eb6-8223-277ba942d2b1" (UID: "8405588b-3a72-4eb6-8223-277ba942d2b1"). InnerVolumeSpecName "kube-api-access-r9h6z". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:04 crc kubenswrapper[4772]: I0320 11:10:04.730109 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566750-vw5dq" event={"ID":"8405588b-3a72-4eb6-8223-277ba942d2b1","Type":"ContainerDied","Data":"2681ffa976f1cdc651851cdeb4ad2ab0f4cba211332ef5ce7012f5ead4eee1df"} Mar 20 11:10:04 crc kubenswrapper[4772]: I0320 11:10:04.730151 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="2681ffa976f1cdc651851cdeb4ad2ab0f4cba211332ef5ce7012f5ead4eee1df" Mar 20 11:10:04 crc kubenswrapper[4772]: I0320 11:10:04.730208 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566750-vw5dq" Mar 20 11:10:04 crc kubenswrapper[4772]: I0320 11:10:04.802240 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-r9h6z\" (UniqueName: \"kubernetes.io/projected/8405588b-3a72-4eb6-8223-277ba942d2b1-kube-api-access-r9h6z\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:05 crc kubenswrapper[4772]: I0320 11:10:05.704469 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-pn8jx"] Mar 20 11:10:05 crc kubenswrapper[4772]: I0320 11:10:05.708498 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566744-pn8jx"] Mar 20 11:10:06 crc kubenswrapper[4772]: I0320 11:10:06.648169 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a1b29235-5568-47d2-b736-a9b91dfbab0b" path="/var/lib/kubelet/pods/a1b29235-5568-47d2-b736-a9b91dfbab0b/volumes" Mar 20 11:10:06 crc kubenswrapper[4772]: I0320 11:10:06.751660 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" event={"ID":"b117c6bf-28df-4a1b-bda6-b96dc51f6531","Type":"ContainerStarted","Data":"949bb264090d2b4228f1c5a51aa7f7d2553044ee97d9c47320fe4346b71feb52"} Mar 20 11:10:06 crc kubenswrapper[4772]: I0320 11:10:06.751816 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:06 crc kubenswrapper[4772]: I0320 11:10:06.753578 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" event={"ID":"344e34de-aef3-4ac6-9492-e6d359b5966d","Type":"ContainerStarted","Data":"7d0c12d34ddc76a2a78784a5f96df9466df2a2b9cfb87ae342f4bbbab017bcac"} Mar 20 11:10:06 crc kubenswrapper[4772]: I0320 11:10:06.753699 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:06 crc kubenswrapper[4772]: I0320 11:10:06.775046 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" podStartSLOduration=1.754457664 podStartE2EDuration="6.775025014s" podCreationTimestamp="2026-03-20 11:10:00 +0000 UTC" firstStartedPulling="2026-03-20 11:10:01.308887114 +0000 UTC m=+887.399853599" lastFinishedPulling="2026-03-20 11:10:06.329454464 +0000 UTC m=+892.420420949" observedRunningTime="2026-03-20 11:10:06.772005894 +0000 UTC m=+892.862972379" watchObservedRunningTime="2026-03-20 11:10:06.775025014 +0000 UTC m=+892.865991499" Mar 20 11:10:06 crc kubenswrapper[4772]: I0320 11:10:06.798073 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" podStartSLOduration=1.563271897 podStartE2EDuration="6.798055581s" podCreationTimestamp="2026-03-20 11:10:00 +0000 UTC" firstStartedPulling="2026-03-20 11:10:01.078105793 +0000 UTC m=+887.169072278" lastFinishedPulling="2026-03-20 11:10:06.312889477 +0000 UTC m=+892.403855962" observedRunningTime="2026-03-20 11:10:06.795857162 +0000 UTC m=+892.886823647" watchObservedRunningTime="2026-03-20 11:10:06.798055581 +0000 UTC m=+892.889022076" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.376297 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-wmmls"] Mar 20 11:10:18 crc 
kubenswrapper[4772]: E0320 11:10:18.378078 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8405588b-3a72-4eb6-8223-277ba942d2b1" containerName="oc" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.378161 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8405588b-3a72-4eb6-8223-277ba942d2b1" containerName="oc" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.378319 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8405588b-3a72-4eb6-8223-277ba942d2b1" containerName="oc" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.379118 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.390945 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wmmls"] Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.487236 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-catalog-content\") pod \"certified-operators-wmmls\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.487312 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9q22q\" (UniqueName: \"kubernetes.io/projected/574aacb2-27e9-481e-9d3e-e04faf42953c-kube-api-access-9q22q\") pod \"certified-operators-wmmls\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.487351 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-utilities\") pod \"certified-operators-wmmls\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.588291 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9q22q\" (UniqueName: \"kubernetes.io/projected/574aacb2-27e9-481e-9d3e-e04faf42953c-kube-api-access-9q22q\") pod \"certified-operators-wmmls\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.588346 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-utilities\") pod \"certified-operators-wmmls\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.588409 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-catalog-content\") pod \"certified-operators-wmmls\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.588868 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-catalog-content\") pod \"certified-operators-wmmls\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.588993 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-utilities\") pod \"certified-operators-wmmls\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.609794 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9q22q\" (UniqueName: \"kubernetes.io/projected/574aacb2-27e9-481e-9d3e-e04faf42953c-kube-api-access-9q22q\") pod \"certified-operators-wmmls\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.697006 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:18 crc kubenswrapper[4772]: I0320 11:10:18.988743 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-wmmls"] Mar 20 11:10:19 crc kubenswrapper[4772]: I0320 11:10:19.829712 4772 generic.go:334] "Generic (PLEG): container finished" podID="574aacb2-27e9-481e-9d3e-e04faf42953c" containerID="1727d29ecbbefc32a0040d57d29ca514c3a96b0eec4bb40148132b4b6db8e68e" exitCode=0 Mar 20 11:10:19 crc kubenswrapper[4772]: I0320 11:10:19.829762 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmmls" event={"ID":"574aacb2-27e9-481e-9d3e-e04faf42953c","Type":"ContainerDied","Data":"1727d29ecbbefc32a0040d57d29ca514c3a96b0eec4bb40148132b4b6db8e68e"} Mar 20 11:10:19 crc kubenswrapper[4772]: I0320 11:10:19.829791 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmmls" event={"ID":"574aacb2-27e9-481e-9d3e-e04faf42953c","Type":"ContainerStarted","Data":"6982534516f5c5a68ecfabc73402e10378521f0924af5110cd3c2f0d1c647636"} Mar 20 11:10:20 crc kubenswrapper[4772]: I0320 11:10:20.837984 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmmls" event={"ID":"574aacb2-27e9-481e-9d3e-e04faf42953c","Type":"ContainerStarted","Data":"21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470"} Mar 20 11:10:21 crc kubenswrapper[4772]: I0320 11:10:21.116425 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-5bcf94d488-gf9n5" Mar 20 11:10:21 crc kubenswrapper[4772]: I0320 11:10:21.848469 4772 generic.go:334] "Generic (PLEG): container finished" podID="574aacb2-27e9-481e-9d3e-e04faf42953c" containerID="21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470" exitCode=0 Mar 20 11:10:21 crc kubenswrapper[4772]: I0320 11:10:21.848518 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmmls" event={"ID":"574aacb2-27e9-481e-9d3e-e04faf42953c","Type":"ContainerDied","Data":"21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470"} Mar 20 11:10:22 crc kubenswrapper[4772]: I0320 11:10:22.856297 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmmls" 
event={"ID":"574aacb2-27e9-481e-9d3e-e04faf42953c","Type":"ContainerStarted","Data":"7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f"} Mar 20 11:10:22 crc kubenswrapper[4772]: I0320 11:10:22.878411 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-wmmls" podStartSLOduration=2.353583762 podStartE2EDuration="4.878393669s" podCreationTimestamp="2026-03-20 11:10:18 +0000 UTC" firstStartedPulling="2026-03-20 11:10:19.831095774 +0000 UTC m=+905.922062259" lastFinishedPulling="2026-03-20 11:10:22.355905671 +0000 UTC m=+908.446872166" observedRunningTime="2026-03-20 11:10:22.876008186 +0000 UTC m=+908.966974671" watchObservedRunningTime="2026-03-20 11:10:22.878393669 +0000 UTC m=+908.969360144" Mar 20 11:10:28 crc kubenswrapper[4772]: I0320 11:10:28.464542 4772 scope.go:117] "RemoveContainer" containerID="06f0d4f436a0ff2ad03b0761fb8607acf4074b20753529a92f495c258e1e122a" Mar 20 11:10:28 crc kubenswrapper[4772]: I0320 11:10:28.697760 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:28 crc kubenswrapper[4772]: I0320 11:10:28.697883 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:28 crc kubenswrapper[4772]: I0320 11:10:28.739600 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:28 crc kubenswrapper[4772]: I0320 11:10:28.961577 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:30 crc kubenswrapper[4772]: I0320 11:10:30.961994 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wmmls"] Mar 20 11:10:30 crc kubenswrapper[4772]: I0320 11:10:30.962461 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-wmmls" podUID="574aacb2-27e9-481e-9d3e-e04faf42953c" containerName="registry-server" containerID="cri-o://7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f" gracePeriod=2 Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.296622 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.361706 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-catalog-content\") pod \"574aacb2-27e9-481e-9d3e-e04faf42953c\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.361872 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9q22q\" (UniqueName: \"kubernetes.io/projected/574aacb2-27e9-481e-9d3e-e04faf42953c-kube-api-access-9q22q\") pod \"574aacb2-27e9-481e-9d3e-e04faf42953c\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.361974 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-utilities\") pod \"574aacb2-27e9-481e-9d3e-e04faf42953c\" (UID: \"574aacb2-27e9-481e-9d3e-e04faf42953c\") " Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.363169 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-utilities" (OuterVolumeSpecName: "utilities") pod "574aacb2-27e9-481e-9d3e-e04faf42953c" (UID: "574aacb2-27e9-481e-9d3e-e04faf42953c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.368587 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/574aacb2-27e9-481e-9d3e-e04faf42953c-kube-api-access-9q22q" (OuterVolumeSpecName: "kube-api-access-9q22q") pod "574aacb2-27e9-481e-9d3e-e04faf42953c" (UID: "574aacb2-27e9-481e-9d3e-e04faf42953c"). InnerVolumeSpecName "kube-api-access-9q22q". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.413413 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "574aacb2-27e9-481e-9d3e-e04faf42953c" (UID: "574aacb2-27e9-481e-9d3e-e04faf42953c"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.463173 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.463218 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/574aacb2-27e9-481e-9d3e-e04faf42953c-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.463234 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9q22q\" (UniqueName: \"kubernetes.io/projected/574aacb2-27e9-481e-9d3e-e04faf42953c-kube-api-access-9q22q\") on node \"crc\" DevicePath \"\"" Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.931030 4772 generic.go:334] "Generic (PLEG): container finished" podID="574aacb2-27e9-481e-9d3e-e04faf42953c" containerID="7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f" exitCode=0 Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.931081 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmmls" event={"ID":"574aacb2-27e9-481e-9d3e-e04faf42953c","Type":"ContainerDied","Data":"7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f"} Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.931110 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-wmmls" event={"ID":"574aacb2-27e9-481e-9d3e-e04faf42953c","Type":"ContainerDied","Data":"6982534516f5c5a68ecfabc73402e10378521f0924af5110cd3c2f0d1c647636"} Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.931114 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-wmmls" Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.931128 4772 scope.go:117] "RemoveContainer" containerID="7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f" Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.948231 4772 scope.go:117] "RemoveContainer" containerID="21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470" Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.968439 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-wmmls"] Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.971905 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-wmmls"] Mar 20 11:10:31 crc kubenswrapper[4772]: I0320 11:10:31.993363 4772 scope.go:117] "RemoveContainer" containerID="1727d29ecbbefc32a0040d57d29ca514c3a96b0eec4bb40148132b4b6db8e68e" Mar 20 11:10:32 crc kubenswrapper[4772]: I0320 11:10:32.013291 4772 scope.go:117] "RemoveContainer" containerID="7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f" Mar 20 11:10:32 crc kubenswrapper[4772]: E0320 11:10:32.013789 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f\": container with ID starting with 7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f not found: ID does not exist" containerID="7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f" Mar 20 11:10:32 crc kubenswrapper[4772]: I0320 11:10:32.013934 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f"} err="failed to get container status \"7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f\": rpc error: code = NotFound desc = could not find container \"7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f\": container with ID starting with 7ac60a161fe224feed96822fed832541240c270a9be95a1cd8cc6df87b37d23f not found: ID does not exist" Mar 20 11:10:32 crc kubenswrapper[4772]: I0320 11:10:32.013985 4772 scope.go:117] "RemoveContainer" containerID="21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470" Mar 20 11:10:32 crc kubenswrapper[4772]: E0320 11:10:32.014278 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470\": container with ID starting with 21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470 not found: ID does not exist" containerID="21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470" Mar 20 11:10:32 crc kubenswrapper[4772]: I0320 11:10:32.014309 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470"} err="failed to get container status \"21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470\": rpc error: code = NotFound desc = could not find container \"21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470\": container with ID starting with 21939f59a1c5f48455870f60d762f577a1c232d6be04c44b531bcdc56949c470 not found: ID does not exist" Mar 20 11:10:32 crc kubenswrapper[4772]: I0320 11:10:32.014327 4772 scope.go:117] "RemoveContainer" 
containerID="1727d29ecbbefc32a0040d57d29ca514c3a96b0eec4bb40148132b4b6db8e68e" Mar 20 11:10:32 crc kubenswrapper[4772]: E0320 11:10:32.014637 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1727d29ecbbefc32a0040d57d29ca514c3a96b0eec4bb40148132b4b6db8e68e\": container with ID starting with 1727d29ecbbefc32a0040d57d29ca514c3a96b0eec4bb40148132b4b6db8e68e not found: ID does not exist" containerID="1727d29ecbbefc32a0040d57d29ca514c3a96b0eec4bb40148132b4b6db8e68e" Mar 20 11:10:32 crc kubenswrapper[4772]: I0320 11:10:32.014672 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1727d29ecbbefc32a0040d57d29ca514c3a96b0eec4bb40148132b4b6db8e68e"} err="failed to get container status \"1727d29ecbbefc32a0040d57d29ca514c3a96b0eec4bb40148132b4b6db8e68e\": rpc error: code = NotFound desc = could not find container \"1727d29ecbbefc32a0040d57d29ca514c3a96b0eec4bb40148132b4b6db8e68e\": container with ID starting with 1727d29ecbbefc32a0040d57d29ca514c3a96b0eec4bb40148132b4b6db8e68e not found: ID does not exist" Mar 20 11:10:32 crc kubenswrapper[4772]: I0320 11:10:32.648012 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="574aacb2-27e9-481e-9d3e-e04faf42953c" path="/var/lib/kubelet/pods/574aacb2-27e9-481e-9d3e-e04faf42953c/volumes" Mar 20 11:10:40 crc kubenswrapper[4772]: I0320 11:10:40.807145 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-6f47d558c9-8f9x8" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.439140 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn"] Mar 20 11:10:41 crc kubenswrapper[4772]: E0320 11:10:41.439339 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574aacb2-27e9-481e-9d3e-e04faf42953c" containerName="registry-server" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.439353 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="574aacb2-27e9-481e-9d3e-e04faf42953c" containerName="registry-server" Mar 20 11:10:41 crc kubenswrapper[4772]: E0320 11:10:41.439369 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574aacb2-27e9-481e-9d3e-e04faf42953c" containerName="extract-utilities" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.439376 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="574aacb2-27e9-481e-9d3e-e04faf42953c" containerName="extract-utilities" Mar 20 11:10:41 crc kubenswrapper[4772]: E0320 11:10:41.439388 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="574aacb2-27e9-481e-9d3e-e04faf42953c" containerName="extract-content" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.439394 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="574aacb2-27e9-481e-9d3e-e04faf42953c" containerName="extract-content" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.439500 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="574aacb2-27e9-481e-9d3e-e04faf42953c" containerName="registry-server" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.439860 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.443821 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-g7pwp"] Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.447981 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.451211 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.451248 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-8gbts" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.452470 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.452835 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.459362 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn"] Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.537786 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-jk4jn"] Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.539239 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.540894 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.540931 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.543326 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-7bb4cc7c98-2vhsh"] Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.544650 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.545122 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-wb9dd" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.545203 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.547084 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.559736 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-2vhsh"] Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.592818 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22225148-3160-41cc-b52b-c294e4e51a57-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jbcbn\" (UID: \"22225148-3160-41cc-b52b-c294e4e51a57\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.592908 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-reloader\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.592936 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b655b584-dc09-480e-8f60-9f7ff0608456-metrics-certs\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.592976 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5rm46\" (UniqueName: \"kubernetes.io/projected/b655b584-dc09-480e-8f60-9f7ff0608456-kube-api-access-5rm46\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.593032 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-frr-conf\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.593061 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-frr-sockets\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.593086 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-metrics\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.593109 4772 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b655b584-dc09-480e-8f60-9f7ff0608456-frr-startup\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.593142 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gz7f\" (UniqueName: \"kubernetes.io/projected/22225148-3160-41cc-b52b-c294e4e51a57-kube-api-access-7gz7f\") pod \"frr-k8s-webhook-server-bcc4b6f68-jbcbn\" (UID: \"22225148-3160-41cc-b52b-c294e4e51a57\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.694438 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22225148-3160-41cc-b52b-c294e4e51a57-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jbcbn\" (UID: \"22225148-3160-41cc-b52b-c294e4e51a57\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.694483 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-reloader\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.694514 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1438038c-9bda-41f0-afb8-a16406defd25-cert\") pod \"controller-7bb4cc7c98-2vhsh\" (UID: \"1438038c-9bda-41f0-afb8-a16406defd25\") " pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.694539 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b655b584-dc09-480e-8f60-9f7ff0608456-metrics-certs\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.694577 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-metrics-certs\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: E0320 11:10:41.694602 4772 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Mar 20 11:10:41 crc kubenswrapper[4772]: E0320 11:10:41.694695 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/22225148-3160-41cc-b52b-c294e4e51a57-cert podName:22225148-3160-41cc-b52b-c294e4e51a57 nodeName:}" failed. No retries permitted until 2026-03-20 11:10:42.194670668 +0000 UTC m=+928.285637183 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/22225148-3160-41cc-b52b-c294e4e51a57-cert") pod "frr-k8s-webhook-server-bcc4b6f68-jbcbn" (UID: "22225148-3160-41cc-b52b-c294e4e51a57") : secret "frr-k8s-webhook-server-cert" not found Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.694609 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5rm46\" (UniqueName: \"kubernetes.io/projected/b655b584-dc09-480e-8f60-9f7ff0608456-kube-api-access-5rm46\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.694971 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-memberlist\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.695014 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d4433c35-0c82-4feb-aedf-0c617ef9ff25-metallb-excludel2\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.695073 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1438038c-9bda-41f0-afb8-a16406defd25-metrics-certs\") pod \"controller-7bb4cc7c98-2vhsh\" (UID: \"1438038c-9bda-41f0-afb8-a16406defd25\") " pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.695174 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-frr-conf\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.695244 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-frr-sockets\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.695290 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-metrics\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.695348 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b655b584-dc09-480e-8f60-9f7ff0608456-frr-startup\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.695420 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gz7f\" (UniqueName: \"kubernetes.io/projected/22225148-3160-41cc-b52b-c294e4e51a57-kube-api-access-7gz7f\") pod \"frr-k8s-webhook-server-bcc4b6f68-jbcbn\" 
(UID: \"22225148-3160-41cc-b52b-c294e4e51a57\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.695454 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g6f76\" (UniqueName: \"kubernetes.io/projected/1438038c-9bda-41f0-afb8-a16406defd25-kube-api-access-g6f76\") pod \"controller-7bb4cc7c98-2vhsh\" (UID: \"1438038c-9bda-41f0-afb8-a16406defd25\") " pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.695509 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q5fjw\" (UniqueName: \"kubernetes.io/projected/d4433c35-0c82-4feb-aedf-0c617ef9ff25-kube-api-access-q5fjw\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.695542 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-reloader\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.695821 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-frr-conf\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.696141 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-frr-sockets\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.696182 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/b655b584-dc09-480e-8f60-9f7ff0608456-metrics\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.696774 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/b655b584-dc09-480e-8f60-9f7ff0608456-frr-startup\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.700788 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b655b584-dc09-480e-8f60-9f7ff0608456-metrics-certs\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.721201 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gz7f\" (UniqueName: \"kubernetes.io/projected/22225148-3160-41cc-b52b-c294e4e51a57-kube-api-access-7gz7f\") pod \"frr-k8s-webhook-server-bcc4b6f68-jbcbn\" (UID: \"22225148-3160-41cc-b52b-c294e4e51a57\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.722247 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-5rm46\" (UniqueName: \"kubernetes.io/projected/b655b584-dc09-480e-8f60-9f7ff0608456-kube-api-access-5rm46\") pod \"frr-k8s-g7pwp\" (UID: \"b655b584-dc09-480e-8f60-9f7ff0608456\") " pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.765160 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.796472 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g6f76\" (UniqueName: \"kubernetes.io/projected/1438038c-9bda-41f0-afb8-a16406defd25-kube-api-access-g6f76\") pod \"controller-7bb4cc7c98-2vhsh\" (UID: \"1438038c-9bda-41f0-afb8-a16406defd25\") " pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.796686 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q5fjw\" (UniqueName: \"kubernetes.io/projected/d4433c35-0c82-4feb-aedf-0c617ef9ff25-kube-api-access-q5fjw\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.796788 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1438038c-9bda-41f0-afb8-a16406defd25-cert\") pod \"controller-7bb4cc7c98-2vhsh\" (UID: \"1438038c-9bda-41f0-afb8-a16406defd25\") " pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.797439 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-metrics-certs\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.798015 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-memberlist\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.798143 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d4433c35-0c82-4feb-aedf-0c617ef9ff25-metallb-excludel2\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.798271 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1438038c-9bda-41f0-afb8-a16406defd25-metrics-certs\") pod \"controller-7bb4cc7c98-2vhsh\" (UID: \"1438038c-9bda-41f0-afb8-a16406defd25\") " pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:41 crc kubenswrapper[4772]: E0320 11:10:41.798167 4772 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 11:10:41 crc kubenswrapper[4772]: E0320 11:10:41.798689 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-memberlist podName:d4433c35-0c82-4feb-aedf-0c617ef9ff25 nodeName:}" failed. 
No retries permitted until 2026-03-20 11:10:42.298671068 +0000 UTC m=+928.389637643 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-memberlist") pod "speaker-jk4jn" (UID: "d4433c35-0c82-4feb-aedf-0c617ef9ff25") : secret "metallb-memberlist" not found Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.799052 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/d4433c35-0c82-4feb-aedf-0c617ef9ff25-metallb-excludel2\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: E0320 11:10:41.799143 4772 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found Mar 20 11:10:41 crc kubenswrapper[4772]: E0320 11:10:41.799178 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1438038c-9bda-41f0-afb8-a16406defd25-metrics-certs podName:1438038c-9bda-41f0-afb8-a16406defd25 nodeName:}" failed. No retries permitted until 2026-03-20 11:10:42.299166171 +0000 UTC m=+928.390132656 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1438038c-9bda-41f0-afb8-a16406defd25-metrics-certs") pod "controller-7bb4cc7c98-2vhsh" (UID: "1438038c-9bda-41f0-afb8-a16406defd25") : secret "controller-certs-secret" not found Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.802131 4772 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.804569 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-metrics-certs\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.812418 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1438038c-9bda-41f0-afb8-a16406defd25-cert\") pod \"controller-7bb4cc7c98-2vhsh\" (UID: \"1438038c-9bda-41f0-afb8-a16406defd25\") " pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.815046 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g6f76\" (UniqueName: \"kubernetes.io/projected/1438038c-9bda-41f0-afb8-a16406defd25-kube-api-access-g6f76\") pod \"controller-7bb4cc7c98-2vhsh\" (UID: \"1438038c-9bda-41f0-afb8-a16406defd25\") " pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.821258 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q5fjw\" (UniqueName: \"kubernetes.io/projected/d4433c35-0c82-4feb-aedf-0c617ef9ff25-kube-api-access-q5fjw\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:41 crc kubenswrapper[4772]: I0320 11:10:41.991260 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g7pwp" event={"ID":"b655b584-dc09-480e-8f60-9f7ff0608456","Type":"ContainerStarted","Data":"9c1f2df4b935e1b6a3430d92c9bf8413f2f5a6775a53a17b63c514586665bcc2"} Mar 20 11:10:42 crc kubenswrapper[4772]: 
I0320 11:10:42.203412 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22225148-3160-41cc-b52b-c294e4e51a57-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jbcbn\" (UID: \"22225148-3160-41cc-b52b-c294e4e51a57\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.207063 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/22225148-3160-41cc-b52b-c294e4e51a57-cert\") pod \"frr-k8s-webhook-server-bcc4b6f68-jbcbn\" (UID: \"22225148-3160-41cc-b52b-c294e4e51a57\") " pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.305066 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-memberlist\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.305312 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1438038c-9bda-41f0-afb8-a16406defd25-metrics-certs\") pod \"controller-7bb4cc7c98-2vhsh\" (UID: \"1438038c-9bda-41f0-afb8-a16406defd25\") " pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:42 crc kubenswrapper[4772]: E0320 11:10:42.306117 4772 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Mar 20 11:10:42 crc kubenswrapper[4772]: E0320 11:10:42.306214 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-memberlist podName:d4433c35-0c82-4feb-aedf-0c617ef9ff25 nodeName:}" failed. No retries permitted until 2026-03-20 11:10:43.30619 +0000 UTC m=+929.397156526 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-memberlist") pod "speaker-jk4jn" (UID: "d4433c35-0c82-4feb-aedf-0c617ef9ff25") : secret "metallb-memberlist" not found Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.308745 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1438038c-9bda-41f0-afb8-a16406defd25-metrics-certs\") pod \"controller-7bb4cc7c98-2vhsh\" (UID: \"1438038c-9bda-41f0-afb8-a16406defd25\") " pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.355550 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.463774 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.696814 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-7bb4cc7c98-2vhsh"] Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.831297 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn"] Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.998390 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-2vhsh" event={"ID":"1438038c-9bda-41f0-afb8-a16406defd25","Type":"ContainerStarted","Data":"6e80694905cf7305cdb556b0309269505e5e47d31599187775e36ef99d6df459"} Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.998444 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-2vhsh" event={"ID":"1438038c-9bda-41f0-afb8-a16406defd25","Type":"ContainerStarted","Data":"b438e72d246b404b70275d5512e393cafa6b5a6296a4d31ac9a30778a5b53dd0"} Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.998463 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-7bb4cc7c98-2vhsh" event={"ID":"1438038c-9bda-41f0-afb8-a16406defd25","Type":"ContainerStarted","Data":"1f0e9eaa18d30f5575c7f4c056f9f860f6a3d6eaeaeccdd3b5391a9c37059bb9"} Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.998501 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:42 crc kubenswrapper[4772]: I0320 11:10:42.999494 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" event={"ID":"22225148-3160-41cc-b52b-c294e4e51a57","Type":"ContainerStarted","Data":"2b4439449c3e37d6bdb2caa05436821bc132babdc4cdbdf811db8ecd8c5a81e1"} Mar 20 11:10:43 crc kubenswrapper[4772]: I0320 11:10:43.020229 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-7bb4cc7c98-2vhsh" podStartSLOduration=2.020210135 podStartE2EDuration="2.020210135s" podCreationTimestamp="2026-03-20 11:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:10:43.016827786 +0000 UTC m=+929.107794291" watchObservedRunningTime="2026-03-20 11:10:43.020210135 +0000 UTC m=+929.111176620" Mar 20 11:10:43 crc kubenswrapper[4772]: I0320 11:10:43.320736 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-memberlist\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:43 crc kubenswrapper[4772]: I0320 11:10:43.326985 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/d4433c35-0c82-4feb-aedf-0c617ef9ff25-memberlist\") pod \"speaker-jk4jn\" (UID: \"d4433c35-0c82-4feb-aedf-0c617ef9ff25\") " pod="metallb-system/speaker-jk4jn" Mar 20 11:10:43 crc kubenswrapper[4772]: I0320 11:10:43.356288 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-jk4jn" Mar 20 11:10:43 crc kubenswrapper[4772]: W0320 11:10:43.373620 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd4433c35_0c82_4feb_aedf_0c617ef9ff25.slice/crio-7792cb40c4213ddfd08736a6aae563ca97a358a689a222f9086cc8194c7b47a3 WatchSource:0}: Error finding container 7792cb40c4213ddfd08736a6aae563ca97a358a689a222f9086cc8194c7b47a3: Status 404 returned error can't find the container with id 7792cb40c4213ddfd08736a6aae563ca97a358a689a222f9086cc8194c7b47a3 Mar 20 11:10:44 crc kubenswrapper[4772]: I0320 11:10:44.011311 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jk4jn" event={"ID":"d4433c35-0c82-4feb-aedf-0c617ef9ff25","Type":"ContainerStarted","Data":"f81b019cdc6476ed4f6659cef95969fbc6fc43be1ffd25a9250a06db016ea8d7"} Mar 20 11:10:44 crc kubenswrapper[4772]: I0320 11:10:44.011378 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jk4jn" event={"ID":"d4433c35-0c82-4feb-aedf-0c617ef9ff25","Type":"ContainerStarted","Data":"4d418eac38639927900c586a255e02aca06013c7889be799f15e1b8822b7a2d3"} Mar 20 11:10:44 crc kubenswrapper[4772]: I0320 11:10:44.011396 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-jk4jn" event={"ID":"d4433c35-0c82-4feb-aedf-0c617ef9ff25","Type":"ContainerStarted","Data":"7792cb40c4213ddfd08736a6aae563ca97a358a689a222f9086cc8194c7b47a3"} Mar 20 11:10:44 crc kubenswrapper[4772]: I0320 11:10:44.011581 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-jk4jn" Mar 20 11:10:44 crc kubenswrapper[4772]: I0320 11:10:44.040802 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-jk4jn" podStartSLOduration=3.040781486 podStartE2EDuration="3.040781486s" podCreationTimestamp="2026-03-20 11:10:41 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:10:44.036881583 +0000 UTC m=+930.127848088" watchObservedRunningTime="2026-03-20 11:10:44.040781486 +0000 UTC m=+930.131747971" Mar 20 11:10:50 crc kubenswrapper[4772]: I0320 11:10:50.050420 4772 generic.go:334] "Generic (PLEG): container finished" podID="b655b584-dc09-480e-8f60-9f7ff0608456" containerID="61d18ff0bf6b4d246c21216516dca10b1cd594582e62e3898d0e97c995c22341" exitCode=0 Mar 20 11:10:50 crc kubenswrapper[4772]: I0320 11:10:50.050478 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g7pwp" event={"ID":"b655b584-dc09-480e-8f60-9f7ff0608456","Type":"ContainerDied","Data":"61d18ff0bf6b4d246c21216516dca10b1cd594582e62e3898d0e97c995c22341"} Mar 20 11:10:50 crc kubenswrapper[4772]: I0320 11:10:50.052281 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" event={"ID":"22225148-3160-41cc-b52b-c294e4e51a57","Type":"ContainerStarted","Data":"6a8d86cabae0eed147fe1944da3a2fadd9c6b5dd264e9245f7a9ad6cbb61509d"} Mar 20 11:10:50 crc kubenswrapper[4772]: I0320 11:10:50.052421 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" Mar 20 11:10:50 crc kubenswrapper[4772]: I0320 11:10:50.093558 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" podStartSLOduration=2.677177055 
podStartE2EDuration="9.093538273s" podCreationTimestamp="2026-03-20 11:10:41 +0000 UTC" firstStartedPulling="2026-03-20 11:10:42.84165926 +0000 UTC m=+928.932625735" lastFinishedPulling="2026-03-20 11:10:49.258020448 +0000 UTC m=+935.348986953" observedRunningTime="2026-03-20 11:10:50.093106592 +0000 UTC m=+936.184073087" watchObservedRunningTime="2026-03-20 11:10:50.093538273 +0000 UTC m=+936.184504758" Mar 20 11:10:51 crc kubenswrapper[4772]: I0320 11:10:51.059721 4772 generic.go:334] "Generic (PLEG): container finished" podID="b655b584-dc09-480e-8f60-9f7ff0608456" containerID="6c6f4bf5af2e05a2e017c68ee77f501457a56de6c3c802c989aff45bbf6dcfda" exitCode=0 Mar 20 11:10:51 crc kubenswrapper[4772]: I0320 11:10:51.059769 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g7pwp" event={"ID":"b655b584-dc09-480e-8f60-9f7ff0608456","Type":"ContainerDied","Data":"6c6f4bf5af2e05a2e017c68ee77f501457a56de6c3c802c989aff45bbf6dcfda"} Mar 20 11:10:52 crc kubenswrapper[4772]: I0320 11:10:52.069896 4772 generic.go:334] "Generic (PLEG): container finished" podID="b655b584-dc09-480e-8f60-9f7ff0608456" containerID="56bff3c2e009d48c586ac73aab73924c7a89f46bff4eeb2457f13fbc702fa771" exitCode=0 Mar 20 11:10:52 crc kubenswrapper[4772]: I0320 11:10:52.070001 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g7pwp" event={"ID":"b655b584-dc09-480e-8f60-9f7ff0608456","Type":"ContainerDied","Data":"56bff3c2e009d48c586ac73aab73924c7a89f46bff4eeb2457f13fbc702fa771"} Mar 20 11:10:52 crc kubenswrapper[4772]: I0320 11:10:52.467900 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-7bb4cc7c98-2vhsh" Mar 20 11:10:53 crc kubenswrapper[4772]: I0320 11:10:53.081273 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g7pwp" event={"ID":"b655b584-dc09-480e-8f60-9f7ff0608456","Type":"ContainerStarted","Data":"20acbf8b73708915bb218486bcbd1112b3216891429ab7ccf2a3cd3c845649b5"} Mar 20 11:10:53 crc kubenswrapper[4772]: I0320 11:10:53.081318 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g7pwp" event={"ID":"b655b584-dc09-480e-8f60-9f7ff0608456","Type":"ContainerStarted","Data":"697fe7bc922d0ef2eff4c3d6cd1d8017c0e06b026f715ddf22d070d24b3180d3"} Mar 20 11:10:53 crc kubenswrapper[4772]: I0320 11:10:53.081330 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g7pwp" event={"ID":"b655b584-dc09-480e-8f60-9f7ff0608456","Type":"ContainerStarted","Data":"d59f4373457b9b12915e0695229436770b07ebd46fe8a93baf143d85589ce77e"} Mar 20 11:10:53 crc kubenswrapper[4772]: I0320 11:10:53.081341 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g7pwp" event={"ID":"b655b584-dc09-480e-8f60-9f7ff0608456","Type":"ContainerStarted","Data":"2f0578daf558536aa617791cf0ddef9de693ebc47c13c504b868207afcb16d7e"} Mar 20 11:10:53 crc kubenswrapper[4772]: I0320 11:10:53.081352 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-g7pwp" event={"ID":"b655b584-dc09-480e-8f60-9f7ff0608456","Type":"ContainerStarted","Data":"6781e4d5862a08a7e870ddc665db78707785e2e32fd359afd7011c8b661d8965"} Mar 20 11:10:53 crc kubenswrapper[4772]: I0320 11:10:53.360160 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-jk4jn" Mar 20 11:10:54 crc kubenswrapper[4772]: I0320 11:10:54.090248 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="metallb-system/frr-k8s-g7pwp" event={"ID":"b655b584-dc09-480e-8f60-9f7ff0608456","Type":"ContainerStarted","Data":"cb0bd1cc9d6823b9cc87d3542e51afdbd83af5b43a0e359d411b21350c9aeffa"} Mar 20 11:10:54 crc kubenswrapper[4772]: I0320 11:10:54.091157 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:54 crc kubenswrapper[4772]: I0320 11:10:54.113083 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-g7pwp" podStartSLOduration=5.8082733189999995 podStartE2EDuration="13.113057565s" podCreationTimestamp="2026-03-20 11:10:41 +0000 UTC" firstStartedPulling="2026-03-20 11:10:41.938212555 +0000 UTC m=+928.029179040" lastFinishedPulling="2026-03-20 11:10:49.242996801 +0000 UTC m=+935.333963286" observedRunningTime="2026-03-20 11:10:54.110449827 +0000 UTC m=+940.201416312" watchObservedRunningTime="2026-03-20 11:10:54.113057565 +0000 UTC m=+940.204024050" Mar 20 11:10:55 crc kubenswrapper[4772]: I0320 11:10:55.933878 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-478lh"] Mar 20 11:10:55 crc kubenswrapper[4772]: I0320 11:10:55.935082 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-478lh" Mar 20 11:10:55 crc kubenswrapper[4772]: I0320 11:10:55.937205 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Mar 20 11:10:55 crc kubenswrapper[4772]: I0320 11:10:55.937425 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-index-dockercfg-9znfz" Mar 20 11:10:55 crc kubenswrapper[4772]: I0320 11:10:55.937601 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Mar 20 11:10:55 crc kubenswrapper[4772]: I0320 11:10:55.995475 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-478lh"] Mar 20 11:10:55 crc kubenswrapper[4772]: I0320 11:10:55.996969 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5nm6g\" (UniqueName: \"kubernetes.io/projected/7bf93913-082a-4910-b955-feb0de5f81ea-kube-api-access-5nm6g\") pod \"openstack-operator-index-478lh\" (UID: \"7bf93913-082a-4910-b955-feb0de5f81ea\") " pod="openstack-operators/openstack-operator-index-478lh" Mar 20 11:10:56 crc kubenswrapper[4772]: I0320 11:10:56.097829 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5nm6g\" (UniqueName: \"kubernetes.io/projected/7bf93913-082a-4910-b955-feb0de5f81ea-kube-api-access-5nm6g\") pod \"openstack-operator-index-478lh\" (UID: \"7bf93913-082a-4910-b955-feb0de5f81ea\") " pod="openstack-operators/openstack-operator-index-478lh" Mar 20 11:10:56 crc kubenswrapper[4772]: I0320 11:10:56.124093 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nm6g\" (UniqueName: \"kubernetes.io/projected/7bf93913-082a-4910-b955-feb0de5f81ea-kube-api-access-5nm6g\") pod \"openstack-operator-index-478lh\" (UID: \"7bf93913-082a-4910-b955-feb0de5f81ea\") " pod="openstack-operators/openstack-operator-index-478lh" Mar 20 11:10:56 crc kubenswrapper[4772]: I0320 11:10:56.300737 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-478lh" Mar 20 11:10:56 crc kubenswrapper[4772]: I0320 11:10:56.720676 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-478lh"] Mar 20 11:10:56 crc kubenswrapper[4772]: W0320 11:10:56.722941 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bf93913_082a_4910_b955_feb0de5f81ea.slice/crio-780a1d3ff641e4b37a9680575a0a2aa5629c1ba725077dc272ae583d532558aa WatchSource:0}: Error finding container 780a1d3ff641e4b37a9680575a0a2aa5629c1ba725077dc272ae583d532558aa: Status 404 returned error can't find the container with id 780a1d3ff641e4b37a9680575a0a2aa5629c1ba725077dc272ae583d532558aa Mar 20 11:10:56 crc kubenswrapper[4772]: I0320 11:10:56.766103 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:56 crc kubenswrapper[4772]: I0320 11:10:56.811293 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:10:57 crc kubenswrapper[4772]: I0320 11:10:57.112161 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-478lh" event={"ID":"7bf93913-082a-4910-b955-feb0de5f81ea","Type":"ContainerStarted","Data":"780a1d3ff641e4b37a9680575a0a2aa5629c1ba725077dc272ae583d532558aa"} Mar 20 11:10:59 crc kubenswrapper[4772]: I0320 11:10:59.314478 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-478lh"] Mar 20 11:10:59 crc kubenswrapper[4772]: I0320 11:10:59.922077 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-index-mm84p"] Mar 20 11:10:59 crc kubenswrapper[4772]: I0320 11:10:59.924008 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mm84p" Mar 20 11:10:59 crc kubenswrapper[4772]: I0320 11:10:59.932689 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mm84p"] Mar 20 11:10:59 crc kubenswrapper[4772]: I0320 11:10:59.940270 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddf49\" (UniqueName: \"kubernetes.io/projected/775463b1-27bc-4fe6-81bf-81170d04d6bf-kube-api-access-ddf49\") pod \"openstack-operator-index-mm84p\" (UID: \"775463b1-27bc-4fe6-81bf-81170d04d6bf\") " pod="openstack-operators/openstack-operator-index-mm84p" Mar 20 11:11:00 crc kubenswrapper[4772]: I0320 11:11:00.041103 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddf49\" (UniqueName: \"kubernetes.io/projected/775463b1-27bc-4fe6-81bf-81170d04d6bf-kube-api-access-ddf49\") pod \"openstack-operator-index-mm84p\" (UID: \"775463b1-27bc-4fe6-81bf-81170d04d6bf\") " pod="openstack-operators/openstack-operator-index-mm84p" Mar 20 11:11:00 crc kubenswrapper[4772]: I0320 11:11:00.059889 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddf49\" (UniqueName: \"kubernetes.io/projected/775463b1-27bc-4fe6-81bf-81170d04d6bf-kube-api-access-ddf49\") pod \"openstack-operator-index-mm84p\" (UID: \"775463b1-27bc-4fe6-81bf-81170d04d6bf\") " pod="openstack-operators/openstack-operator-index-mm84p" Mar 20 11:11:00 crc kubenswrapper[4772]: I0320 11:11:00.133427 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-478lh" event={"ID":"7bf93913-082a-4910-b955-feb0de5f81ea","Type":"ContainerStarted","Data":"ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483"} Mar 20 11:11:00 crc kubenswrapper[4772]: I0320 11:11:00.133567 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/openstack-operator-index-478lh" podUID="7bf93913-082a-4910-b955-feb0de5f81ea" containerName="registry-server" containerID="cri-o://ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483" gracePeriod=2 Mar 20 11:11:00 crc kubenswrapper[4772]: I0320 11:11:00.153387 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-478lh" podStartSLOduration=2.711899883 podStartE2EDuration="5.153371455s" podCreationTimestamp="2026-03-20 11:10:55 +0000 UTC" firstStartedPulling="2026-03-20 11:10:56.724939697 +0000 UTC m=+942.815906182" lastFinishedPulling="2026-03-20 11:10:59.166411269 +0000 UTC m=+945.257377754" observedRunningTime="2026-03-20 11:11:00.148712431 +0000 UTC m=+946.239678916" watchObservedRunningTime="2026-03-20 11:11:00.153371455 +0000 UTC m=+946.244337940" Mar 20 11:11:00 crc kubenswrapper[4772]: I0320 11:11:00.240615 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-mm84p" Mar 20 11:11:00 crc kubenswrapper[4772]: I0320 11:11:00.651217 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-index-mm84p"] Mar 20 11:11:00 crc kubenswrapper[4772]: W0320 11:11:00.662531 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod775463b1_27bc_4fe6_81bf_81170d04d6bf.slice/crio-5f1bfb6d9851b496b88b84deb43949745653861bf07b25d7b6d5bb3501f9a175 WatchSource:0}: Error finding container 5f1bfb6d9851b496b88b84deb43949745653861bf07b25d7b6d5bb3501f9a175: Status 404 returned error can't find the container with id 5f1bfb6d9851b496b88b84deb43949745653861bf07b25d7b6d5bb3501f9a175 Mar 20 11:11:00 crc kubenswrapper[4772]: I0320 11:11:00.982402 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-index-478lh" Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.144676 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mm84p" event={"ID":"775463b1-27bc-4fe6-81bf-81170d04d6bf","Type":"ContainerStarted","Data":"0f35845a9ebe443d06f63e23d8d91795e3f6e4adb90de6592be1cfdc2586fbd4"} Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.144756 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-mm84p" event={"ID":"775463b1-27bc-4fe6-81bf-81170d04d6bf","Type":"ContainerStarted","Data":"5f1bfb6d9851b496b88b84deb43949745653861bf07b25d7b6d5bb3501f9a175"} Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.148461 4772 generic.go:334] "Generic (PLEG): container finished" podID="7bf93913-082a-4910-b955-feb0de5f81ea" containerID="ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483" exitCode=0 Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.148530 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-478lh" event={"ID":"7bf93913-082a-4910-b955-feb0de5f81ea","Type":"ContainerDied","Data":"ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483"} Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.148565 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-index-478lh" Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.148570 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-index-478lh" event={"ID":"7bf93913-082a-4910-b955-feb0de5f81ea","Type":"ContainerDied","Data":"780a1d3ff641e4b37a9680575a0a2aa5629c1ba725077dc272ae583d532558aa"} Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.148589 4772 scope.go:117] "RemoveContainer" containerID="ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483" Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.157286 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5nm6g\" (UniqueName: \"kubernetes.io/projected/7bf93913-082a-4910-b955-feb0de5f81ea-kube-api-access-5nm6g\") pod \"7bf93913-082a-4910-b955-feb0de5f81ea\" (UID: \"7bf93913-082a-4910-b955-feb0de5f81ea\") " Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.168693 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-index-mm84p" podStartSLOduration=2.054133548 podStartE2EDuration="2.168664447s" podCreationTimestamp="2026-03-20 11:10:59 +0000 UTC" firstStartedPulling="2026-03-20 11:11:00.668663832 +0000 UTC m=+946.759630327" lastFinishedPulling="2026-03-20 11:11:00.783194711 +0000 UTC m=+946.874161226" observedRunningTime="2026-03-20 11:11:01.15810787 +0000 UTC m=+947.249074395" watchObservedRunningTime="2026-03-20 11:11:01.168664447 +0000 UTC m=+947.259630962" Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.170069 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bf93913-082a-4910-b955-feb0de5f81ea-kube-api-access-5nm6g" (OuterVolumeSpecName: "kube-api-access-5nm6g") pod "7bf93913-082a-4910-b955-feb0de5f81ea" (UID: "7bf93913-082a-4910-b955-feb0de5f81ea"). InnerVolumeSpecName "kube-api-access-5nm6g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.175128 4772 scope.go:117] "RemoveContainer" containerID="ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483" Mar 20 11:11:01 crc kubenswrapper[4772]: E0320 11:11:01.176345 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483\": container with ID starting with ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483 not found: ID does not exist" containerID="ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483" Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.176400 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483"} err="failed to get container status \"ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483\": rpc error: code = NotFound desc = could not find container \"ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483\": container with ID starting with ec373f9ad1e852916952ee9fcdc81d2470bdb3eae24df8ebe7f5c00c04b68483 not found: ID does not exist" Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.263896 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5nm6g\" (UniqueName: \"kubernetes.io/projected/7bf93913-082a-4910-b955-feb0de5f81ea-kube-api-access-5nm6g\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.478610 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/openstack-operator-index-478lh"] Mar 20 11:11:01 crc kubenswrapper[4772]: I0320 11:11:01.482914 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/openstack-operator-index-478lh"] Mar 20 11:11:02 crc kubenswrapper[4772]: I0320 11:11:02.364811 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-bcc4b6f68-jbcbn" Mar 20 11:11:02 crc kubenswrapper[4772]: I0320 11:11:02.650659 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bf93913-082a-4910-b955-feb0de5f81ea" path="/var/lib/kubelet/pods/7bf93913-082a-4910-b955-feb0de5f81ea/volumes" Mar 20 11:11:10 crc kubenswrapper[4772]: I0320 11:11:10.241535 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-index-mm84p" Mar 20 11:11:10 crc kubenswrapper[4772]: I0320 11:11:10.242207 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/openstack-operator-index-mm84p" Mar 20 11:11:10 crc kubenswrapper[4772]: I0320 11:11:10.268262 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/openstack-operator-index-mm84p" Mar 20 11:11:11 crc kubenswrapper[4772]: I0320 11:11:11.233938 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-index-mm84p" Mar 20 11:11:11 crc kubenswrapper[4772]: I0320 11:11:11.774234 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-g7pwp" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.382358 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl"] Mar 20 11:11:17 crc 
kubenswrapper[4772]: E0320 11:11:17.383117 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7bf93913-082a-4910-b955-feb0de5f81ea" containerName="registry-server" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.383130 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bf93913-082a-4910-b955-feb0de5f81ea" containerName="registry-server" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.383228 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7bf93913-082a-4910-b955-feb0de5f81ea" containerName="registry-server" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.383968 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.386691 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-t92h6" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.392267 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl"] Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.469253 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.469731 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.469775 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hddc5\" (UniqueName: \"kubernetes.io/projected/26278da9-8afd-4b22-b53b-dc7334d50643-kube-api-access-hddc5\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.570933 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.570989 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hddc5\" (UniqueName: \"kubernetes.io/projected/26278da9-8afd-4b22-b53b-dc7334d50643-kube-api-access-hddc5\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " 
pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.571066 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.571497 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-bundle\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.571664 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-util\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.589382 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hddc5\" (UniqueName: \"kubernetes.io/projected/26278da9-8afd-4b22-b53b-dc7334d50643-kube-api-access-hddc5\") pod \"8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:17 crc kubenswrapper[4772]: I0320 11:11:17.708332 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:18 crc kubenswrapper[4772]: I0320 11:11:18.144949 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl"] Mar 20 11:11:18 crc kubenswrapper[4772]: I0320 11:11:18.253262 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" event={"ID":"26278da9-8afd-4b22-b53b-dc7334d50643","Type":"ContainerStarted","Data":"b07ff16e58e9dc7f5e3173eda4c3f7732f2d108b91df91f9343f2e863df30542"} Mar 20 11:11:19 crc kubenswrapper[4772]: I0320 11:11:19.263203 4772 generic.go:334] "Generic (PLEG): container finished" podID="26278da9-8afd-4b22-b53b-dc7334d50643" containerID="75025343883e030497198916b1d753e7819c317459ad63194f24d74bc000f2d9" exitCode=0 Mar 20 11:11:19 crc kubenswrapper[4772]: I0320 11:11:19.263439 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" event={"ID":"26278da9-8afd-4b22-b53b-dc7334d50643","Type":"ContainerDied","Data":"75025343883e030497198916b1d753e7819c317459ad63194f24d74bc000f2d9"} Mar 20 11:11:20 crc kubenswrapper[4772]: I0320 11:11:20.276073 4772 generic.go:334] "Generic (PLEG): container finished" podID="26278da9-8afd-4b22-b53b-dc7334d50643" containerID="1a2cc8b53ff5cd9eab77839c28a0f670b955a85636a857024d17aeefe08deb4e" exitCode=0 Mar 20 11:11:20 crc kubenswrapper[4772]: I0320 11:11:20.276140 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" event={"ID":"26278da9-8afd-4b22-b53b-dc7334d50643","Type":"ContainerDied","Data":"1a2cc8b53ff5cd9eab77839c28a0f670b955a85636a857024d17aeefe08deb4e"} Mar 20 11:11:21 crc kubenswrapper[4772]: I0320 11:11:21.284148 4772 generic.go:334] "Generic (PLEG): container finished" podID="26278da9-8afd-4b22-b53b-dc7334d50643" containerID="e771fa355063b27c5b164100d4fc96857b783c15570c0279093ff412de9cb68d" exitCode=0 Mar 20 11:11:21 crc kubenswrapper[4772]: I0320 11:11:21.284239 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" event={"ID":"26278da9-8afd-4b22-b53b-dc7334d50643","Type":"ContainerDied","Data":"e771fa355063b27c5b164100d4fc96857b783c15570c0279093ff412de9cb68d"} Mar 20 11:11:22 crc kubenswrapper[4772]: I0320 11:11:22.518562 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:22 crc kubenswrapper[4772]: I0320 11:11:22.549495 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-bundle\") pod \"26278da9-8afd-4b22-b53b-dc7334d50643\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " Mar 20 11:11:22 crc kubenswrapper[4772]: I0320 11:11:22.549571 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hddc5\" (UniqueName: \"kubernetes.io/projected/26278da9-8afd-4b22-b53b-dc7334d50643-kube-api-access-hddc5\") pod \"26278da9-8afd-4b22-b53b-dc7334d50643\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " Mar 20 11:11:22 crc kubenswrapper[4772]: I0320 11:11:22.549651 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-util\") pod \"26278da9-8afd-4b22-b53b-dc7334d50643\" (UID: \"26278da9-8afd-4b22-b53b-dc7334d50643\") " Mar 20 11:11:22 crc kubenswrapper[4772]: I0320 11:11:22.550156 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-bundle" (OuterVolumeSpecName: "bundle") pod "26278da9-8afd-4b22-b53b-dc7334d50643" (UID: "26278da9-8afd-4b22-b53b-dc7334d50643"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:22 crc kubenswrapper[4772]: I0320 11:11:22.554395 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/26278da9-8afd-4b22-b53b-dc7334d50643-kube-api-access-hddc5" (OuterVolumeSpecName: "kube-api-access-hddc5") pod "26278da9-8afd-4b22-b53b-dc7334d50643" (UID: "26278da9-8afd-4b22-b53b-dc7334d50643"). InnerVolumeSpecName "kube-api-access-hddc5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:22 crc kubenswrapper[4772]: I0320 11:11:22.562950 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-util" (OuterVolumeSpecName: "util") pod "26278da9-8afd-4b22-b53b-dc7334d50643" (UID: "26278da9-8afd-4b22-b53b-dc7334d50643"). InnerVolumeSpecName "util". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:22 crc kubenswrapper[4772]: I0320 11:11:22.650796 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hddc5\" (UniqueName: \"kubernetes.io/projected/26278da9-8afd-4b22-b53b-dc7334d50643-kube-api-access-hddc5\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:22 crc kubenswrapper[4772]: I0320 11:11:22.650831 4772 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-util\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:22 crc kubenswrapper[4772]: I0320 11:11:22.650856 4772 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/26278da9-8afd-4b22-b53b-dc7334d50643-bundle\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:23 crc kubenswrapper[4772]: I0320 11:11:23.306788 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" event={"ID":"26278da9-8afd-4b22-b53b-dc7334d50643","Type":"ContainerDied","Data":"b07ff16e58e9dc7f5e3173eda4c3f7732f2d108b91df91f9343f2e863df30542"} Mar 20 11:11:23 crc kubenswrapper[4772]: I0320 11:11:23.306854 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b07ff16e58e9dc7f5e3173eda4c3f7732f2d108b91df91f9343f2e863df30542" Mar 20 11:11:23 crc kubenswrapper[4772]: I0320 11:11:23.306928 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.104459 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4"] Mar 20 11:11:30 crc kubenswrapper[4772]: E0320 11:11:30.105427 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26278da9-8afd-4b22-b53b-dc7334d50643" containerName="extract" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.105442 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="26278da9-8afd-4b22-b53b-dc7334d50643" containerName="extract" Mar 20 11:11:30 crc kubenswrapper[4772]: E0320 11:11:30.105461 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26278da9-8afd-4b22-b53b-dc7334d50643" containerName="util" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.105468 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="26278da9-8afd-4b22-b53b-dc7334d50643" containerName="util" Mar 20 11:11:30 crc kubenswrapper[4772]: E0320 11:11:30.105485 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="26278da9-8afd-4b22-b53b-dc7334d50643" containerName="pull" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.105493 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="26278da9-8afd-4b22-b53b-dc7334d50643" containerName="pull" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.105629 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="26278da9-8afd-4b22-b53b-dc7334d50643" containerName="extract" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.106184 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.108199 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-init-dockercfg-t6dns" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.129364 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4"] Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.250070 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dbb9p\" (UniqueName: \"kubernetes.io/projected/571efd1c-abe7-4edc-a3d8-508b2ec30b37-kube-api-access-dbb9p\") pod \"openstack-operator-controller-init-846ffbb776-fc7k4\" (UID: \"571efd1c-abe7-4edc-a3d8-508b2ec30b37\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.351717 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dbb9p\" (UniqueName: \"kubernetes.io/projected/571efd1c-abe7-4edc-a3d8-508b2ec30b37-kube-api-access-dbb9p\") pod \"openstack-operator-controller-init-846ffbb776-fc7k4\" (UID: \"571efd1c-abe7-4edc-a3d8-508b2ec30b37\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.382178 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dbb9p\" (UniqueName: \"kubernetes.io/projected/571efd1c-abe7-4edc-a3d8-508b2ec30b37-kube-api-access-dbb9p\") pod \"openstack-operator-controller-init-846ffbb776-fc7k4\" (UID: \"571efd1c-abe7-4edc-a3d8-508b2ec30b37\") " pod="openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.427295 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4" Mar 20 11:11:30 crc kubenswrapper[4772]: I0320 11:11:30.904828 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4"] Mar 20 11:11:31 crc kubenswrapper[4772]: I0320 11:11:31.352352 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4" event={"ID":"571efd1c-abe7-4edc-a3d8-508b2ec30b37","Type":"ContainerStarted","Data":"b1e4fe77e0b02516d611662eba2dd79c2393ea5e07c12cb44d86ed4df622179f"} Mar 20 11:11:35 crc kubenswrapper[4772]: I0320 11:11:35.380289 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4" event={"ID":"571efd1c-abe7-4edc-a3d8-508b2ec30b37","Type":"ContainerStarted","Data":"c97bb7b898e2492b808650b2431fc8272838fd83ed872e33fe32d57aa656d852"} Mar 20 11:11:35 crc kubenswrapper[4772]: I0320 11:11:35.380842 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4" Mar 20 11:11:35 crc kubenswrapper[4772]: I0320 11:11:35.404767 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4" podStartSLOduration=1.813076393 podStartE2EDuration="5.404746521s" podCreationTimestamp="2026-03-20 11:11:30 +0000 UTC" firstStartedPulling="2026-03-20 11:11:30.913250093 +0000 UTC m=+977.004216578" lastFinishedPulling="2026-03-20 11:11:34.504920221 +0000 UTC m=+980.595886706" observedRunningTime="2026-03-20 11:11:35.403199291 +0000 UTC m=+981.494165796" watchObservedRunningTime="2026-03-20 11:11:35.404746521 +0000 UTC m=+981.495713006" Mar 20 11:11:39 crc kubenswrapper[4772]: I0320 11:11:39.564352 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:11:39 crc kubenswrapper[4772]: I0320 11:11:39.566133 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.145530 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-v6sh8"] Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.147214 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.153966 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6sh8"] Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.172021 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5n8x\" (UniqueName: \"kubernetes.io/projected/01ef4996-bc22-42f3-b298-a27dddd23838-kube-api-access-c5n8x\") pod \"redhat-marketplace-v6sh8\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.172086 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-catalog-content\") pod \"redhat-marketplace-v6sh8\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.172287 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-utilities\") pod \"redhat-marketplace-v6sh8\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.273147 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c5n8x\" (UniqueName: \"kubernetes.io/projected/01ef4996-bc22-42f3-b298-a27dddd23838-kube-api-access-c5n8x\") pod \"redhat-marketplace-v6sh8\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.273198 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-catalog-content\") pod \"redhat-marketplace-v6sh8\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.273235 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-utilities\") pod \"redhat-marketplace-v6sh8\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.273784 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-utilities\") pod \"redhat-marketplace-v6sh8\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.273783 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-catalog-content\") pod \"redhat-marketplace-v6sh8\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.292513 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-c5n8x\" (UniqueName: \"kubernetes.io/projected/01ef4996-bc22-42f3-b298-a27dddd23838-kube-api-access-c5n8x\") pod \"redhat-marketplace-v6sh8\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.430504 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-init-846ffbb776-fc7k4" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.471904 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:40 crc kubenswrapper[4772]: I0320 11:11:40.738298 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6sh8"] Mar 20 11:11:41 crc kubenswrapper[4772]: I0320 11:11:41.419598 4772 generic.go:334] "Generic (PLEG): container finished" podID="01ef4996-bc22-42f3-b298-a27dddd23838" containerID="817e6ed95fa9ad23b01cfcb50cafc2bc38f015422b531d0cc6ab21ca6658598e" exitCode=0 Mar 20 11:11:41 crc kubenswrapper[4772]: I0320 11:11:41.419692 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6sh8" event={"ID":"01ef4996-bc22-42f3-b298-a27dddd23838","Type":"ContainerDied","Data":"817e6ed95fa9ad23b01cfcb50cafc2bc38f015422b531d0cc6ab21ca6658598e"} Mar 20 11:11:41 crc kubenswrapper[4772]: I0320 11:11:41.420208 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6sh8" event={"ID":"01ef4996-bc22-42f3-b298-a27dddd23838","Type":"ContainerStarted","Data":"392a3c6f515de228dd5508d8f67576f249efbc3c1dffaa3aa7839b20e7e822f1"} Mar 20 11:11:42 crc kubenswrapper[4772]: I0320 11:11:42.430583 4772 generic.go:334] "Generic (PLEG): container finished" podID="01ef4996-bc22-42f3-b298-a27dddd23838" containerID="bce2fd6e4e725de4a4b8d0578a68853fb9c7c0ea596c71dfc97d24671eecf08c" exitCode=0 Mar 20 11:11:42 crc kubenswrapper[4772]: I0320 11:11:42.430630 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6sh8" event={"ID":"01ef4996-bc22-42f3-b298-a27dddd23838","Type":"ContainerDied","Data":"bce2fd6e4e725de4a4b8d0578a68853fb9c7c0ea596c71dfc97d24671eecf08c"} Mar 20 11:11:43 crc kubenswrapper[4772]: I0320 11:11:43.440115 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6sh8" event={"ID":"01ef4996-bc22-42f3-b298-a27dddd23838","Type":"ContainerStarted","Data":"df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152"} Mar 20 11:11:43 crc kubenswrapper[4772]: I0320 11:11:43.456485 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-v6sh8" podStartSLOduration=2.010496929 podStartE2EDuration="3.45647014s" podCreationTimestamp="2026-03-20 11:11:40 +0000 UTC" firstStartedPulling="2026-03-20 11:11:41.421372516 +0000 UTC m=+987.512338991" lastFinishedPulling="2026-03-20 11:11:42.867345717 +0000 UTC m=+988.958312202" observedRunningTime="2026-03-20 11:11:43.453777649 +0000 UTC m=+989.544744144" watchObservedRunningTime="2026-03-20 11:11:43.45647014 +0000 UTC m=+989.547436625" Mar 20 11:11:50 crc kubenswrapper[4772]: I0320 11:11:50.472428 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:50 crc kubenswrapper[4772]: I0320 11:11:50.473104 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:50 crc kubenswrapper[4772]: I0320 11:11:50.541296 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:50 crc kubenswrapper[4772]: I0320 11:11:50.589305 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:50 crc kubenswrapper[4772]: I0320 11:11:50.771513 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6sh8"] Mar 20 11:11:52 crc kubenswrapper[4772]: I0320 11:11:52.523871 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-v6sh8" podUID="01ef4996-bc22-42f3-b298-a27dddd23838" containerName="registry-server" containerID="cri-o://df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152" gracePeriod=2 Mar 20 11:11:52 crc kubenswrapper[4772]: I0320 11:11:52.887316 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.000519 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-catalog-content\") pod \"01ef4996-bc22-42f3-b298-a27dddd23838\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.000606 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-utilities\") pod \"01ef4996-bc22-42f3-b298-a27dddd23838\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.000679 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c5n8x\" (UniqueName: \"kubernetes.io/projected/01ef4996-bc22-42f3-b298-a27dddd23838-kube-api-access-c5n8x\") pod \"01ef4996-bc22-42f3-b298-a27dddd23838\" (UID: \"01ef4996-bc22-42f3-b298-a27dddd23838\") " Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.001575 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-utilities" (OuterVolumeSpecName: "utilities") pod "01ef4996-bc22-42f3-b298-a27dddd23838" (UID: "01ef4996-bc22-42f3-b298-a27dddd23838"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.007208 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ef4996-bc22-42f3-b298-a27dddd23838-kube-api-access-c5n8x" (OuterVolumeSpecName: "kube-api-access-c5n8x") pod "01ef4996-bc22-42f3-b298-a27dddd23838" (UID: "01ef4996-bc22-42f3-b298-a27dddd23838"). InnerVolumeSpecName "kube-api-access-c5n8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.102685 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.102737 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c5n8x\" (UniqueName: \"kubernetes.io/projected/01ef4996-bc22-42f3-b298-a27dddd23838-kube-api-access-c5n8x\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.186517 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "01ef4996-bc22-42f3-b298-a27dddd23838" (UID: "01ef4996-bc22-42f3-b298-a27dddd23838"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.203760 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/01ef4996-bc22-42f3-b298-a27dddd23838-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.532288 4772 generic.go:334] "Generic (PLEG): container finished" podID="01ef4996-bc22-42f3-b298-a27dddd23838" containerID="df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152" exitCode=0 Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.532382 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6sh8" event={"ID":"01ef4996-bc22-42f3-b298-a27dddd23838","Type":"ContainerDied","Data":"df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152"} Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.532812 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-v6sh8" event={"ID":"01ef4996-bc22-42f3-b298-a27dddd23838","Type":"ContainerDied","Data":"392a3c6f515de228dd5508d8f67576f249efbc3c1dffaa3aa7839b20e7e822f1"} Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.532885 4772 scope.go:117] "RemoveContainer" containerID="df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.532402 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-v6sh8" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.552558 4772 scope.go:117] "RemoveContainer" containerID="bce2fd6e4e725de4a4b8d0578a68853fb9c7c0ea596c71dfc97d24671eecf08c" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.564492 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6sh8"] Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.570245 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-v6sh8"] Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.584026 4772 scope.go:117] "RemoveContainer" containerID="817e6ed95fa9ad23b01cfcb50cafc2bc38f015422b531d0cc6ab21ca6658598e" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.598301 4772 scope.go:117] "RemoveContainer" containerID="df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152" Mar 20 11:11:53 crc kubenswrapper[4772]: E0320 11:11:53.598691 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152\": container with ID starting with df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152 not found: ID does not exist" containerID="df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.598726 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152"} err="failed to get container status \"df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152\": rpc error: code = NotFound desc = could not find container \"df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152\": container with ID starting with df9cda9147f0a3909f5a126f7d01412c21c45a18f0bbceb64c4a62d743044152 not found: ID does not exist" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.598766 4772 scope.go:117] "RemoveContainer" containerID="bce2fd6e4e725de4a4b8d0578a68853fb9c7c0ea596c71dfc97d24671eecf08c" Mar 20 11:11:53 crc kubenswrapper[4772]: E0320 11:11:53.599174 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bce2fd6e4e725de4a4b8d0578a68853fb9c7c0ea596c71dfc97d24671eecf08c\": container with ID starting with bce2fd6e4e725de4a4b8d0578a68853fb9c7c0ea596c71dfc97d24671eecf08c not found: ID does not exist" containerID="bce2fd6e4e725de4a4b8d0578a68853fb9c7c0ea596c71dfc97d24671eecf08c" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.599218 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bce2fd6e4e725de4a4b8d0578a68853fb9c7c0ea596c71dfc97d24671eecf08c"} err="failed to get container status \"bce2fd6e4e725de4a4b8d0578a68853fb9c7c0ea596c71dfc97d24671eecf08c\": rpc error: code = NotFound desc = could not find container \"bce2fd6e4e725de4a4b8d0578a68853fb9c7c0ea596c71dfc97d24671eecf08c\": container with ID starting with bce2fd6e4e725de4a4b8d0578a68853fb9c7c0ea596c71dfc97d24671eecf08c not found: ID does not exist" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.599244 4772 scope.go:117] "RemoveContainer" containerID="817e6ed95fa9ad23b01cfcb50cafc2bc38f015422b531d0cc6ab21ca6658598e" Mar 20 11:11:53 crc kubenswrapper[4772]: E0320 11:11:53.599590 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"817e6ed95fa9ad23b01cfcb50cafc2bc38f015422b531d0cc6ab21ca6658598e\": container with ID starting with 817e6ed95fa9ad23b01cfcb50cafc2bc38f015422b531d0cc6ab21ca6658598e not found: ID does not exist" containerID="817e6ed95fa9ad23b01cfcb50cafc2bc38f015422b531d0cc6ab21ca6658598e" Mar 20 11:11:53 crc kubenswrapper[4772]: I0320 11:11:53.599622 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"817e6ed95fa9ad23b01cfcb50cafc2bc38f015422b531d0cc6ab21ca6658598e"} err="failed to get container status \"817e6ed95fa9ad23b01cfcb50cafc2bc38f015422b531d0cc6ab21ca6658598e\": rpc error: code = NotFound desc = could not find container \"817e6ed95fa9ad23b01cfcb50cafc2bc38f015422b531d0cc6ab21ca6658598e\": container with ID starting with 817e6ed95fa9ad23b01cfcb50cafc2bc38f015422b531d0cc6ab21ca6658598e not found: ID does not exist" Mar 20 11:11:54 crc kubenswrapper[4772]: I0320 11:11:54.649338 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ef4996-bc22-42f3-b298-a27dddd23838" path="/var/lib/kubelet/pods/01ef4996-bc22-42f3-b298-a27dddd23838/volumes" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.134574 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566752-85knh"] Mar 20 11:12:00 crc kubenswrapper[4772]: E0320 11:12:00.135265 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ef4996-bc22-42f3-b298-a27dddd23838" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.135275 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ef4996-bc22-42f3-b298-a27dddd23838" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4772]: E0320 11:12:00.135295 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ef4996-bc22-42f3-b298-a27dddd23838" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.135302 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ef4996-bc22-42f3-b298-a27dddd23838" containerName="extract-utilities" Mar 20 11:12:00 crc kubenswrapper[4772]: E0320 11:12:00.135317 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="01ef4996-bc22-42f3-b298-a27dddd23838" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.135323 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="01ef4996-bc22-42f3-b298-a27dddd23838" containerName="extract-content" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.135422 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="01ef4996-bc22-42f3-b298-a27dddd23838" containerName="registry-server" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.135794 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-85knh" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.138893 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.138985 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.139499 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.143152 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-85knh"] Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.197404 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7gp4\" (UniqueName: \"kubernetes.io/projected/51958443-f266-4496-9825-88985bff7a41-kube-api-access-b7gp4\") pod \"auto-csr-approver-29566752-85knh\" (UID: \"51958443-f266-4496-9825-88985bff7a41\") " pod="openshift-infra/auto-csr-approver-29566752-85knh" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.299013 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b7gp4\" (UniqueName: \"kubernetes.io/projected/51958443-f266-4496-9825-88985bff7a41-kube-api-access-b7gp4\") pod \"auto-csr-approver-29566752-85knh\" (UID: \"51958443-f266-4496-9825-88985bff7a41\") " pod="openshift-infra/auto-csr-approver-29566752-85knh" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.330722 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7gp4\" (UniqueName: \"kubernetes.io/projected/51958443-f266-4496-9825-88985bff7a41-kube-api-access-b7gp4\") pod \"auto-csr-approver-29566752-85knh\" (UID: \"51958443-f266-4496-9825-88985bff7a41\") " pod="openshift-infra/auto-csr-approver-29566752-85knh" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.453011 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-85knh" Mar 20 11:12:00 crc kubenswrapper[4772]: I0320 11:12:00.844061 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-85knh"] Mar 20 11:12:01 crc kubenswrapper[4772]: I0320 11:12:01.586287 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-85knh" event={"ID":"51958443-f266-4496-9825-88985bff7a41","Type":"ContainerStarted","Data":"92be90c90b1fe12a5f9f6204b9e22663b9ef6cd1057a9cd47d8e125a606c2c69"} Mar 20 11:12:02 crc kubenswrapper[4772]: I0320 11:12:02.594303 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-85knh" event={"ID":"51958443-f266-4496-9825-88985bff7a41","Type":"ContainerStarted","Data":"24b9aeccfcd22f0a9771929ce620113e303a493dae8973c020565b63c00aa39f"} Mar 20 11:12:02 crc kubenswrapper[4772]: I0320 11:12:02.613505 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566752-85knh" podStartSLOduration=1.515097275 podStartE2EDuration="2.613480947s" podCreationTimestamp="2026-03-20 11:12:00 +0000 UTC" firstStartedPulling="2026-03-20 11:12:00.860066796 +0000 UTC m=+1006.951033281" lastFinishedPulling="2026-03-20 11:12:01.958450468 +0000 UTC m=+1008.049416953" observedRunningTime="2026-03-20 11:12:02.610092238 +0000 UTC m=+1008.701058743" watchObservedRunningTime="2026-03-20 11:12:02.613480947 +0000 UTC m=+1008.704447432" Mar 20 11:12:03 crc kubenswrapper[4772]: I0320 11:12:03.602228 4772 generic.go:334] "Generic (PLEG): container finished" podID="51958443-f266-4496-9825-88985bff7a41" containerID="24b9aeccfcd22f0a9771929ce620113e303a493dae8973c020565b63c00aa39f" exitCode=0 Mar 20 11:12:03 crc kubenswrapper[4772]: I0320 11:12:03.602337 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-85knh" event={"ID":"51958443-f266-4496-9825-88985bff7a41","Type":"ContainerDied","Data":"24b9aeccfcd22f0a9771929ce620113e303a493dae8973c020565b63c00aa39f"} Mar 20 11:12:04 crc kubenswrapper[4772]: I0320 11:12:04.884208 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-85knh" Mar 20 11:12:04 crc kubenswrapper[4772]: I0320 11:12:04.970573 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-b7gp4\" (UniqueName: \"kubernetes.io/projected/51958443-f266-4496-9825-88985bff7a41-kube-api-access-b7gp4\") pod \"51958443-f266-4496-9825-88985bff7a41\" (UID: \"51958443-f266-4496-9825-88985bff7a41\") " Mar 20 11:12:04 crc kubenswrapper[4772]: I0320 11:12:04.976002 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/51958443-f266-4496-9825-88985bff7a41-kube-api-access-b7gp4" (OuterVolumeSpecName: "kube-api-access-b7gp4") pod "51958443-f266-4496-9825-88985bff7a41" (UID: "51958443-f266-4496-9825-88985bff7a41"). InnerVolumeSpecName "kube-api-access-b7gp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:12:05 crc kubenswrapper[4772]: I0320 11:12:05.072388 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-b7gp4\" (UniqueName: \"kubernetes.io/projected/51958443-f266-4496-9825-88985bff7a41-kube-api-access-b7gp4\") on node \"crc\" DevicePath \"\"" Mar 20 11:12:05 crc kubenswrapper[4772]: I0320 11:12:05.621925 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566752-85knh" event={"ID":"51958443-f266-4496-9825-88985bff7a41","Type":"ContainerDied","Data":"92be90c90b1fe12a5f9f6204b9e22663b9ef6cd1057a9cd47d8e125a606c2c69"} Mar 20 11:12:05 crc kubenswrapper[4772]: I0320 11:12:05.621987 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="92be90c90b1fe12a5f9f6204b9e22663b9ef6cd1057a9cd47d8e125a606c2c69" Mar 20 11:12:05 crc kubenswrapper[4772]: I0320 11:12:05.622044 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566752-85knh" Mar 20 11:12:05 crc kubenswrapper[4772]: I0320 11:12:05.708171 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-znz9n"] Mar 20 11:12:05 crc kubenswrapper[4772]: I0320 11:12:05.717977 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566746-znz9n"] Mar 20 11:12:06 crc kubenswrapper[4772]: I0320 11:12:06.649409 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5a30b4d8-69d6-4376-a53a-9011a7e681d9" path="/var/lib/kubelet/pods/5a30b4d8-69d6-4376-a53a-9011a7e681d9/volumes" Mar 20 11:12:09 crc kubenswrapper[4772]: I0320 11:12:09.564237 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:12:09 crc kubenswrapper[4772]: I0320 11:12:09.564309 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.368179 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d"] Mar 20 11:12:15 crc kubenswrapper[4772]: E0320 11:12:15.369143 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="51958443-f266-4496-9825-88985bff7a41" containerName="oc" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.369160 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="51958443-f266-4496-9825-88985bff7a41" containerName="oc" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.369350 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="51958443-f266-4496-9825-88985bff7a41" containerName="oc" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.369891 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.372748 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"barbican-operator-controller-manager-dockercfg-f96l8" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.374832 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.375898 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.379512 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"cinder-operator-controller-manager-dockercfg-zmw6s" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.380709 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.389129 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.390138 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.391702 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"designate-operator-controller-manager-dockercfg-fw6tv" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.395273 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-scms7\" (UniqueName: \"kubernetes.io/projected/9980fd55-eca0-4c27-a021-59acc8681bfd-kube-api-access-scms7\") pod \"barbican-operator-controller-manager-59bc569d95-t4z2d\" (UID: \"9980fd55-eca0-4c27-a021-59acc8681bfd\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.395379 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ppfh6\" (UniqueName: \"kubernetes.io/projected/f6d09f24-ca68-486c-8fb6-e34e3172077a-kube-api-access-ppfh6\") pod \"cinder-operator-controller-manager-8d58dc466-hlkjp\" (UID: \"f6d09f24-ca68-486c-8fb6-e34e3172077a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.410061 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.432714 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.434058 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.436295 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-zwk78" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.446431 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.447798 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.451471 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"heat-operator-controller-manager-dockercfg-6mn62" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.472082 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.479867 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.493873 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.494827 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.496172 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-scms7\" (UniqueName: \"kubernetes.io/projected/9980fd55-eca0-4c27-a021-59acc8681bfd-kube-api-access-scms7\") pod \"barbican-operator-controller-manager-59bc569d95-t4z2d\" (UID: \"9980fd55-eca0-4c27-a021-59acc8681bfd\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.496255 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ppfh6\" (UniqueName: \"kubernetes.io/projected/f6d09f24-ca68-486c-8fb6-e34e3172077a-kube-api-access-ppfh6\") pod \"cinder-operator-controller-manager-8d58dc466-hlkjp\" (UID: \"f6d09f24-ca68-486c-8fb6-e34e3172077a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.496291 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mbnsc\" (UniqueName: \"kubernetes.io/projected/7ec0f9bb-bb24-4e95-8fb4-734eaee29058-kube-api-access-mbnsc\") pod \"glance-operator-controller-manager-79df6bcc97-224ts\" (UID: \"7ec0f9bb-bb24-4e95-8fb4-734eaee29058\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.496334 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zwzj2\" (UniqueName: \"kubernetes.io/projected/1854d930-56ec-441f-87d6-821b656cd195-kube-api-access-zwzj2\") pod \"heat-operator-controller-manager-67dd5f86f5-92hmc\" (UID: \"1854d930-56ec-441f-87d6-821b656cd195\") " 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.496380 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kfg5c\" (UniqueName: \"kubernetes.io/projected/3b1e9a1f-4847-46ed-9239-11b64f01ef55-kube-api-access-kfg5c\") pod \"designate-operator-controller-manager-588d4d986b-62t9t\" (UID: \"3b1e9a1f-4847-46ed-9239-11b64f01ef55\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.500565 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.503053 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-r9vs4" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.527008 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.531199 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-scms7\" (UniqueName: \"kubernetes.io/projected/9980fd55-eca0-4c27-a021-59acc8681bfd-kube-api-access-scms7\") pod \"barbican-operator-controller-manager-59bc569d95-t4z2d\" (UID: \"9980fd55-eca0-4c27-a021-59acc8681bfd\") " pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.536217 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ppfh6\" (UniqueName: \"kubernetes.io/projected/f6d09f24-ca68-486c-8fb6-e34e3172077a-kube-api-access-ppfh6\") pod \"cinder-operator-controller-manager-8d58dc466-hlkjp\" (UID: \"f6d09f24-ca68-486c-8fb6-e34e3172077a\") " pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.544639 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.545632 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.553870 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-zwmnf" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.554215 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-webhook-server-cert" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.570586 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.594125 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.594906 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.599217 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ironic-operator-controller-manager-dockercfg-kgb9j" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.602789 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.602853 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mbnsc\" (UniqueName: \"kubernetes.io/projected/7ec0f9bb-bb24-4e95-8fb4-734eaee29058-kube-api-access-mbnsc\") pod \"glance-operator-controller-manager-79df6bcc97-224ts\" (UID: \"7ec0f9bb-bb24-4e95-8fb4-734eaee29058\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.602879 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zwzj2\" (UniqueName: \"kubernetes.io/projected/1854d930-56ec-441f-87d6-821b656cd195-kube-api-access-zwzj2\") pod \"heat-operator-controller-manager-67dd5f86f5-92hmc\" (UID: \"1854d930-56ec-441f-87d6-821b656cd195\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.602910 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pd9p\" (UniqueName: \"kubernetes.io/projected/a7308ded-ef41-499c-ae52-13d9e32b51e1-kube-api-access-2pd9p\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.602931 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kfg5c\" (UniqueName: \"kubernetes.io/projected/3b1e9a1f-4847-46ed-9239-11b64f01ef55-kube-api-access-kfg5c\") pod \"designate-operator-controller-manager-588d4d986b-62t9t\" (UID: \"3b1e9a1f-4847-46ed-9239-11b64f01ef55\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.602951 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5f8s6\" (UniqueName: \"kubernetes.io/projected/ab4a9bd4-c1e1-453d-b586-5089696704fb-kube-api-access-5f8s6\") pod \"horizon-operator-controller-manager-8464cc45fb-kh9lg\" (UID: \"ab4a9bd4-c1e1-453d-b586-5089696704fb\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.623990 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.635508 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mbnsc\" (UniqueName: \"kubernetes.io/projected/7ec0f9bb-bb24-4e95-8fb4-734eaee29058-kube-api-access-mbnsc\") pod 
\"glance-operator-controller-manager-79df6bcc97-224ts\" (UID: \"7ec0f9bb-bb24-4e95-8fb4-734eaee29058\") " pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.638312 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.639393 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.643292 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-xzcpq" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.643503 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.644435 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.645918 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"manila-operator-controller-manager-dockercfg-pbn7j" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.654460 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zwzj2\" (UniqueName: \"kubernetes.io/projected/1854d930-56ec-441f-87d6-821b656cd195-kube-api-access-zwzj2\") pod \"heat-operator-controller-manager-67dd5f86f5-92hmc\" (UID: \"1854d930-56ec-441f-87d6-821b656cd195\") " pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.656602 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kfg5c\" (UniqueName: \"kubernetes.io/projected/3b1e9a1f-4847-46ed-9239-11b64f01ef55-kube-api-access-kfg5c\") pod \"designate-operator-controller-manager-588d4d986b-62t9t\" (UID: \"3b1e9a1f-4847-46ed-9239-11b64f01ef55\") " pod="openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.656647 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.657369 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.661670 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-q4tcd" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.670918 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.689641 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.705151 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pd9p\" (UniqueName: \"kubernetes.io/projected/a7308ded-ef41-499c-ae52-13d9e32b51e1-kube-api-access-2pd9p\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.705202 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5f8s6\" (UniqueName: \"kubernetes.io/projected/ab4a9bd4-c1e1-453d-b586-5089696704fb-kube-api-access-5f8s6\") pod \"horizon-operator-controller-manager-8464cc45fb-kh9lg\" (UID: \"ab4a9bd4-c1e1-453d-b586-5089696704fb\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.705236 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7pz4\" (UniqueName: \"kubernetes.io/projected/127b144a-f395-409d-a0a4-b79b60a60c1f-kube-api-access-f7pz4\") pod \"ironic-operator-controller-manager-6f787dddc9-tcjvl\" (UID: \"127b144a-f395-409d-a0a4-b79b60a60c1f\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.705266 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hrbc9\" (UniqueName: \"kubernetes.io/projected/52caed6a-6dcf-40d0-929c-948d0e421958-kube-api-access-hrbc9\") pod \"keystone-operator-controller-manager-768b96df4c-g8thw\" (UID: \"52caed6a-6dcf-40d0-929c-948d0e421958\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.705323 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9ctxk\" (UniqueName: \"kubernetes.io/projected/953ed1af-7caa-4fe5-8443-3e5aa1caa77c-kube-api-access-9ctxk\") pod \"mariadb-operator-controller-manager-67ccfc9778-qcwm6\" (UID: \"953ed1af-7caa-4fe5-8443-3e5aa1caa77c\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.705339 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.705372 4772 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96fqz\" (UniqueName: \"kubernetes.io/projected/51140d29-3d51-44e1-a884-ffbad20bbb15-kube-api-access-96fqz\") pod \"manila-operator-controller-manager-55f864c847-bvmfv\" (UID: \"51140d29-3d51-44e1-a884-ffbad20bbb15\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" Mar 20 11:12:15 crc kubenswrapper[4772]: E0320 11:12:15.705979 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:15 crc kubenswrapper[4772]: E0320 11:12:15.706062 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert podName:a7308ded-ef41-499c-ae52-13d9e32b51e1 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:16.206045571 +0000 UTC m=+1022.297012056 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert") pod "infra-operator-controller-manager-669fff9c7c-7tx6m" (UID: "a7308ded-ef41-499c-ae52-13d9e32b51e1") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.712610 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.728227 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pd9p\" (UniqueName: \"kubernetes.io/projected/a7308ded-ef41-499c-ae52-13d9e32b51e1-kube-api-access-2pd9p\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.728318 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.730261 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.731209 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.733562 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5f8s6\" (UniqueName: \"kubernetes.io/projected/ab4a9bd4-c1e1-453d-b586-5089696704fb-kube-api-access-5f8s6\") pod \"horizon-operator-controller-manager-8464cc45fb-kh9lg\" (UID: \"ab4a9bd4-c1e1-453d-b586-5089696704fb\") " pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.737762 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"neutron-operator-controller-manager-dockercfg-5vptc" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.762152 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.782699 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.788995 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.818290 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.818656 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.819054 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9ctxk\" (UniqueName: \"kubernetes.io/projected/953ed1af-7caa-4fe5-8443-3e5aa1caa77c-kube-api-access-9ctxk\") pod \"mariadb-operator-controller-manager-67ccfc9778-qcwm6\" (UID: \"953ed1af-7caa-4fe5-8443-3e5aa1caa77c\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.819099 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96fqz\" (UniqueName: \"kubernetes.io/projected/51140d29-3d51-44e1-a884-ffbad20bbb15-kube-api-access-96fqz\") pod \"manila-operator-controller-manager-55f864c847-bvmfv\" (UID: \"51140d29-3d51-44e1-a884-ffbad20bbb15\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.819438 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pl5sp\" (UniqueName: \"kubernetes.io/projected/e6248bcf-f076-4165-a9c7-0239c16e980d-kube-api-access-pl5sp\") pod \"neutron-operator-controller-manager-767865f676-wjc8s\" (UID: \"e6248bcf-f076-4165-a9c7-0239c16e980d\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.819480 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f7pz4\" (UniqueName: \"kubernetes.io/projected/127b144a-f395-409d-a0a4-b79b60a60c1f-kube-api-access-f7pz4\") pod \"ironic-operator-controller-manager-6f787dddc9-tcjvl\" (UID: \"127b144a-f395-409d-a0a4-b79b60a60c1f\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.819521 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hrbc9\" (UniqueName: \"kubernetes.io/projected/52caed6a-6dcf-40d0-929c-948d0e421958-kube-api-access-hrbc9\") pod \"keystone-operator-controller-manager-768b96df4c-g8thw\" (UID: \"52caed6a-6dcf-40d0-929c-948d0e421958\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.827179 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.844923 4772 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96fqz\" (UniqueName: \"kubernetes.io/projected/51140d29-3d51-44e1-a884-ffbad20bbb15-kube-api-access-96fqz\") pod \"manila-operator-controller-manager-55f864c847-bvmfv\" (UID: \"51140d29-3d51-44e1-a884-ffbad20bbb15\") " pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.851723 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hrbc9\" (UniqueName: \"kubernetes.io/projected/52caed6a-6dcf-40d0-929c-948d0e421958-kube-api-access-hrbc9\") pod \"keystone-operator-controller-manager-768b96df4c-g8thw\" (UID: \"52caed6a-6dcf-40d0-929c-948d0e421958\") " pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.863164 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.864270 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.868146 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.868998 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"nova-operator-controller-manager-dockercfg-qgcr2" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.869284 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7pz4\" (UniqueName: \"kubernetes.io/projected/127b144a-f395-409d-a0a4-b79b60a60c1f-kube-api-access-f7pz4\") pod \"ironic-operator-controller-manager-6f787dddc9-tcjvl\" (UID: \"127b144a-f395-409d-a0a4-b79b60a60c1f\") " pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.869831 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.874658 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.877267 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"octavia-operator-controller-manager-dockercfg-jm4q4" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.880578 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.886576 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9ctxk\" (UniqueName: \"kubernetes.io/projected/953ed1af-7caa-4fe5-8443-3e5aa1caa77c-kube-api-access-9ctxk\") pod \"mariadb-operator-controller-manager-67ccfc9778-qcwm6\" (UID: \"953ed1af-7caa-4fe5-8443-3e5aa1caa77c\") " pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.891989 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.893343 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.899360 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.899932 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"placement-operator-controller-manager-dockercfg-dqwgn" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.900486 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.902649 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.905009 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"ovn-operator-controller-manager-dockercfg-lftt6" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.913421 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.916495 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.919888 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.920728 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pl5sp\" (UniqueName: \"kubernetes.io/projected/e6248bcf-f076-4165-a9c7-0239c16e980d-kube-api-access-pl5sp\") pod \"neutron-operator-controller-manager-767865f676-wjc8s\" (UID: \"e6248bcf-f076-4165-a9c7-0239c16e980d\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.920768 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsntq\" (UniqueName: \"kubernetes.io/projected/5c181b83-da5d-454e-a061-c647bde19d5e-kube-api-access-wsntq\") pod \"octavia-operator-controller-manager-5b9f45d989-7p7tf\" (UID: \"5c181b83-da5d-454e-a061-c647bde19d5e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.920866 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bstl8\" (UniqueName: \"kubernetes.io/projected/546c5d3b-9054-45fb-9e45-95d01b61d012-kube-api-access-bstl8\") pod \"nova-operator-controller-manager-5d488d59fb-jxssq\" (UID: \"546c5d3b-9054-45fb-9e45-95d01b61d012\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.921432 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-8rbh7" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.927930 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.954109 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.956824 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pl5sp\" (UniqueName: \"kubernetes.io/projected/e6248bcf-f076-4165-a9c7-0239c16e980d-kube-api-access-pl5sp\") pod \"neutron-operator-controller-manager-767865f676-wjc8s\" (UID: \"e6248bcf-f076-4165-a9c7-0239c16e980d\") " pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.979239 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.980040 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.984513 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-webhook-server-cert" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.984535 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-baremetal-operator-controller-manager-dockercfg-bdp4k" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.986797 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9"] Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.987785 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" Mar 20 11:12:15 crc kubenswrapper[4772]: I0320 11:12:15.994680 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.002312 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"telemetry-operator-controller-manager-dockercfg-fcq8w" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.009391 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.013213 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.023260 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.024057 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84fbd\" (UniqueName: \"kubernetes.io/projected/55e1d800-c5ed-4902-bfb3-b36e761a526b-kube-api-access-84fbd\") pod \"swift-operator-controller-manager-c674c5965-lz8q4\" (UID: \"55e1d800-c5ed-4902-bfb3-b36e761a526b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.026009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wsntq\" (UniqueName: \"kubernetes.io/projected/5c181b83-da5d-454e-a061-c647bde19d5e-kube-api-access-wsntq\") pod \"octavia-operator-controller-manager-5b9f45d989-7p7tf\" (UID: \"5c181b83-da5d-454e-a061-c647bde19d5e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.026992 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbzks\" (UniqueName: \"kubernetes.io/projected/574f98d0-6e6f-43b0-a4d1-b13a5d123536-kube-api-access-fbzks\") pod \"ovn-operator-controller-manager-884679f54-qdjbq\" (UID: \"574f98d0-6e6f-43b0-a4d1-b13a5d123536\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.027201 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bstl8\" 
(UniqueName: \"kubernetes.io/projected/546c5d3b-9054-45fb-9e45-95d01b61d012-kube-api-access-bstl8\") pod \"nova-operator-controller-manager-5d488d59fb-jxssq\" (UID: \"546c5d3b-9054-45fb-9e45-95d01b61d012\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.027383 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fpkqj\" (UniqueName: \"kubernetes.io/projected/32c94d79-871d-465b-afe6-e929661093c6-kube-api-access-fpkqj\") pod \"placement-operator-controller-manager-5784578c99-jtzxg\" (UID: \"32c94d79-871d-465b-afe6-e929661093c6\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.024557 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.036932 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"test-operator-controller-manager-dockercfg-b55mz" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.045408 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cnssm\" (UID: \"a1c1c7a4-6840-4e52-b242-7de225eaac97\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.045460 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgnzz\" (UniqueName: \"kubernetes.io/projected/a1c1c7a4-6840-4e52-b242-7de225eaac97-kube-api-access-bgnzz\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cnssm\" (UID: \"a1c1c7a4-6840-4e52-b242-7de225eaac97\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.045522 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7qz5b\" (UniqueName: \"kubernetes.io/projected/d85081ab-dafc-467c-9445-9b7b221a56ee-kube-api-access-7qz5b\") pod \"telemetry-operator-controller-manager-d6b694c5-nb5t9\" (UID: \"d85081ab-dafc-467c-9445-9b7b221a56ee\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.045636 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.048330 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.061516 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsntq\" (UniqueName: \"kubernetes.io/projected/5c181b83-da5d-454e-a061-c647bde19d5e-kube-api-access-wsntq\") pod \"octavia-operator-controller-manager-5b9f45d989-7p7tf\" (UID: \"5c181b83-da5d-454e-a061-c647bde19d5e\") " pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.074054 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.083920 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bstl8\" (UniqueName: \"kubernetes.io/projected/546c5d3b-9054-45fb-9e45-95d01b61d012-kube-api-access-bstl8\") pod \"nova-operator-controller-manager-5d488d59fb-jxssq\" (UID: \"546c5d3b-9054-45fb-9e45-95d01b61d012\") " pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.095939 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.135951 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.136770 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.141705 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"watcher-operator-controller-manager-dockercfg-6z7lc" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.146784 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cnssm\" (UID: \"a1c1c7a4-6840-4e52-b242-7de225eaac97\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.147228 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgnzz\" (UniqueName: \"kubernetes.io/projected/a1c1c7a4-6840-4e52-b242-7de225eaac97-kube-api-access-bgnzz\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cnssm\" (UID: \"a1c1c7a4-6840-4e52-b242-7de225eaac97\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.147375 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7qz5b\" (UniqueName: \"kubernetes.io/projected/d85081ab-dafc-467c-9445-9b7b221a56ee-kube-api-access-7qz5b\") pod \"telemetry-operator-controller-manager-d6b694c5-nb5t9\" (UID: \"d85081ab-dafc-467c-9445-9b7b221a56ee\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.147499 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84fbd\" (UniqueName: \"kubernetes.io/projected/55e1d800-c5ed-4902-bfb3-b36e761a526b-kube-api-access-84fbd\") pod \"swift-operator-controller-manager-c674c5965-lz8q4\" (UID: \"55e1d800-c5ed-4902-bfb3-b36e761a526b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.147629 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbzks\" (UniqueName: \"kubernetes.io/projected/574f98d0-6e6f-43b0-a4d1-b13a5d123536-kube-api-access-fbzks\") pod \"ovn-operator-controller-manager-884679f54-qdjbq\" (UID: \"574f98d0-6e6f-43b0-a4d1-b13a5d123536\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.147748 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvpcg\" (UniqueName: \"kubernetes.io/projected/e23924f7-44e6-4464-a6d6-240718124df8-kube-api-access-kvpcg\") pod \"test-operator-controller-manager-5c5cb9c4d7-s5gs7\" (UID: \"e23924f7-44e6-4464-a6d6-240718124df8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.147927 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fpkqj\" (UniqueName: \"kubernetes.io/projected/32c94d79-871d-465b-afe6-e929661093c6-kube-api-access-fpkqj\") pod \"placement-operator-controller-manager-5784578c99-jtzxg\" (UID: \"32c94d79-871d-465b-afe6-e929661093c6\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg" Mar 20 
11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.146934 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c"] Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.146992 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.148769 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert podName:a1c1c7a4-6840-4e52-b242-7de225eaac97 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:16.648748238 +0000 UTC m=+1022.739714723 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-cnssm" (UID: "a1c1c7a4-6840-4e52-b242-7de225eaac97") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.173470 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgnzz\" (UniqueName: \"kubernetes.io/projected/a1c1c7a4-6840-4e52-b242-7de225eaac97-kube-api-access-bgnzz\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cnssm\" (UID: \"a1c1c7a4-6840-4e52-b242-7de225eaac97\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.175998 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fpkqj\" (UniqueName: \"kubernetes.io/projected/32c94d79-871d-465b-afe6-e929661093c6-kube-api-access-fpkqj\") pod \"placement-operator-controller-manager-5784578c99-jtzxg\" (UID: \"32c94d79-871d-465b-afe6-e929661093c6\") " pod="openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.179941 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7qz5b\" (UniqueName: \"kubernetes.io/projected/d85081ab-dafc-467c-9445-9b7b221a56ee-kube-api-access-7qz5b\") pod \"telemetry-operator-controller-manager-d6b694c5-nb5t9\" (UID: \"d85081ab-dafc-467c-9445-9b7b221a56ee\") " pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.181875 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84fbd\" (UniqueName: \"kubernetes.io/projected/55e1d800-c5ed-4902-bfb3-b36e761a526b-kube-api-access-84fbd\") pod \"swift-operator-controller-manager-c674c5965-lz8q4\" (UID: \"55e1d800-c5ed-4902-bfb3-b36e761a526b\") " pod="openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.190444 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbzks\" (UniqueName: \"kubernetes.io/projected/574f98d0-6e6f-43b0-a4d1-b13a5d123536-kube-api-access-fbzks\") pod \"ovn-operator-controller-manager-884679f54-qdjbq\" (UID: \"574f98d0-6e6f-43b0-a4d1-b13a5d123536\") " pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.190855 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.192040 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.193184 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.197228 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.197305 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"metrics-server-cert" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.217204 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"openstack-operator-controller-manager-dockercfg-tspw7" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.242732 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.248891 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.259870 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kvpcg\" (UniqueName: \"kubernetes.io/projected/e23924f7-44e6-4464-a6d6-240718124df8-kube-api-access-kvpcg\") pod \"test-operator-controller-manager-5c5cb9c4d7-s5gs7\" (UID: \"e23924f7-44e6-4464-a6d6-240718124df8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.259959 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.260002 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.260030 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c9d7\" (UniqueName: \"kubernetes.io/projected/48ac539f-199f-49e4-8330-8956df8ea12f-kube-api-access-5c9d7\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.260068 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk5js\" (UniqueName: 
\"kubernetes.io/projected/3f853c54-2e6e-41b9-b15f-3435b17477f2-kube-api-access-rk5js\") pod \"watcher-operator-controller-manager-6c4d75f7f9-f488c\" (UID: \"3f853c54-2e6e-41b9-b15f-3435b17477f2\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.260132 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.260298 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.260368 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert podName:a7308ded-ef41-499c-ae52-13d9e32b51e1 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:17.260349079 +0000 UTC m=+1023.351315574 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert") pod "infra-operator-controller-manager-669fff9c7c-7tx6m" (UID: "a7308ded-ef41-499c-ae52-13d9e32b51e1") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.286459 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvpcg\" (UniqueName: \"kubernetes.io/projected/e23924f7-44e6-4464-a6d6-240718124df8-kube-api-access-kvpcg\") pod \"test-operator-controller-manager-5c5cb9c4d7-s5gs7\" (UID: \"e23924f7-44e6-4464-a6d6-240718124df8\") " pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.303526 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.325254 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.343420 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.363348 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.363408 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.363439 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5c9d7\" (UniqueName: \"kubernetes.io/projected/48ac539f-199f-49e4-8330-8956df8ea12f-kube-api-access-5c9d7\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.363480 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk5js\" (UniqueName: \"kubernetes.io/projected/3f853c54-2e6e-41b9-b15f-3435b17477f2-kube-api-access-rk5js\") pod \"watcher-operator-controller-manager-6c4d75f7f9-f488c\" (UID: \"3f853c54-2e6e-41b9-b15f-3435b17477f2\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c" Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.363823 4772 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.363988 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs podName:48ac539f-199f-49e4-8330-8956df8ea12f nodeName:}" failed. No retries permitted until 2026-03-20 11:12:16.86391889 +0000 UTC m=+1022.954885375 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-xp244" (UID: "48ac539f-199f-49e4-8330-8956df8ea12f") : secret "metrics-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.364108 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.364153 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs podName:48ac539f-199f-49e4-8330-8956df8ea12f nodeName:}" failed. No retries permitted until 2026-03-20 11:12:16.864140137 +0000 UTC m=+1022.955106622 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-xp244" (UID: "48ac539f-199f-49e4-8330-8956df8ea12f") : secret "webhook-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.374720 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.380117 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.387540 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk5js\" (UniqueName: \"kubernetes.io/projected/3f853c54-2e6e-41b9-b15f-3435b17477f2-kube-api-access-rk5js\") pod \"watcher-operator-controller-manager-6c4d75f7f9-f488c\" (UID: \"3f853c54-2e6e-41b9-b15f-3435b17477f2\") " pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.393947 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c9d7\" (UniqueName: \"kubernetes.io/projected/48ac539f-199f-49e4-8330-8956df8ea12f-kube-api-access-5c9d7\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.394235 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.394629 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.407447 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.422523 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7" Mar 20 11:12:16 crc kubenswrapper[4772]: W0320 11:12:16.452999 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9980fd55_eca0_4c27_a021_59acc8681bfd.slice/crio-4fd9806dd076388c9f9a4ed366c5f58481c0ebbdeeaf725a6a645e39d578ad64 WatchSource:0}: Error finding container 4fd9806dd076388c9f9a4ed366c5f58481c0ebbdeeaf725a6a645e39d578ad64: Status 404 returned error can't find the container with id 4fd9806dd076388c9f9a4ed366c5f58481c0ebbdeeaf725a6a645e39d578ad64 Mar 20 11:12:16 crc kubenswrapper[4772]: W0320 11:12:16.464925 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b1e9a1f_4847_46ed_9239_11b64f01ef55.slice/crio-b4fbba3be19fba0a527f89ae84c594b5e31d44ba43c128709b5f84d73b01c6f9 WatchSource:0}: Error finding container b4fbba3be19fba0a527f89ae84c594b5e31d44ba43c128709b5f84d73b01c6f9: Status 404 returned error can't find the container with id b4fbba3be19fba0a527f89ae84c594b5e31d44ba43c128709b5f84d73b01c6f9 Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.477013 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c" Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.668157 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cnssm\" (UID: \"a1c1c7a4-6840-4e52-b242-7de225eaac97\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.668564 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.668614 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert podName:a1c1c7a4-6840-4e52-b242-7de225eaac97 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:17.668600508 +0000 UTC m=+1023.759566993 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-cnssm" (UID: "a1c1c7a4-6840-4e52-b242-7de225eaac97") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.705008 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" event={"ID":"7ec0f9bb-bb24-4e95-8fb4-734eaee29058","Type":"ContainerStarted","Data":"ce24aa6d6268639e984f3a9f462f86704a38f4db2159e0de25f37320360522b4"} Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.710630 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t" event={"ID":"3b1e9a1f-4847-46ed-9239-11b64f01ef55","Type":"ContainerStarted","Data":"b4fbba3be19fba0a527f89ae84c594b5e31d44ba43c128709b5f84d73b01c6f9"} Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.711394 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d" event={"ID":"9980fd55-eca0-4c27-a021-59acc8681bfd","Type":"ContainerStarted","Data":"4fd9806dd076388c9f9a4ed366c5f58481c0ebbdeeaf725a6a645e39d578ad64"} Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.712160 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" event={"ID":"f6d09f24-ca68-486c-8fb6-e34e3172077a","Type":"ContainerStarted","Data":"9bded33eeda6263e7c15fe66586dea162ca993e7144f6a9dab915fd7202c39ca"} Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.835519 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.859963 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.864238 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc"] Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.871060 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.871267 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.871323 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.871344 4772 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs podName:48ac539f-199f-49e4-8330-8956df8ea12f nodeName:}" failed. No retries permitted until 2026-03-20 11:12:17.871323888 +0000 UTC m=+1023.962290453 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-xp244" (UID: "48ac539f-199f-49e4-8330-8956df8ea12f") : secret "webhook-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.871461 4772 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: E0320 11:12:16.871530 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs podName:48ac539f-199f-49e4-8330-8956df8ea12f nodeName:}" failed. No retries permitted until 2026-03-20 11:12:17.871512354 +0000 UTC m=+1023.962478919 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-xp244" (UID: "48ac539f-199f-49e4-8330-8956df8ea12f") : secret "metrics-server-cert" not found Mar 20 11:12:16 crc kubenswrapper[4772]: W0320 11:12:16.876729 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab4a9bd4_c1e1_453d_b586_5089696704fb.slice/crio-3683850c6c0568038255884c878824a5cfe61961ef925b6b23c621f731eb4ad2 WatchSource:0}: Error finding container 3683850c6c0568038255884c878824a5cfe61961ef925b6b23c621f731eb4ad2: Status 404 returned error can't find the container with id 3683850c6c0568038255884c878824a5cfe61961ef925b6b23c621f731eb4ad2 Mar 20 11:12:16 crc kubenswrapper[4772]: W0320 11:12:16.881877 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1854d930_56ec_441f_87d6_821b656cd195.slice/crio-ee6dad8b84d3ff683c60e8e2235ec52d06edbab2caa1e1ebca5491005b4ab570 WatchSource:0}: Error finding container ee6dad8b84d3ff683c60e8e2235ec52d06edbab2caa1e1ebca5491005b4ab570: Status 404 returned error can't find the container with id ee6dad8b84d3ff683c60e8e2235ec52d06edbab2caa1e1ebca5491005b4ab570 Mar 20 11:12:16 crc kubenswrapper[4772]: I0320 11:12:16.976432 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw"] Mar 20 11:12:16 crc kubenswrapper[4772]: W0320 11:12:16.983667 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52caed6a_6dcf_40d0_929c_948d0e421958.slice/crio-b8cf2d12b9d477ee73892118c493481e62e6ca5690753da73fb0db7fe8ff49b7 WatchSource:0}: Error finding container b8cf2d12b9d477ee73892118c493481e62e6ca5690753da73fb0db7fe8ff49b7: Status 404 returned error can't find the container with id b8cf2d12b9d477ee73892118c493481e62e6ca5690753da73fb0db7fe8ff49b7 Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.016388 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s"] Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.210187 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf"] Mar 20 11:12:17 crc kubenswrapper[4772]: W0320 11:12:17.215694 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5c181b83_da5d_454e_a061_c647bde19d5e.slice/crio-6e8af272a28fd62d3cd5f845021b2983d380eb75ff36788623fb2640de871609 WatchSource:0}: Error finding container 6e8af272a28fd62d3cd5f845021b2983d380eb75ff36788623fb2640de871609: Status 404 returned error can't find the container with id 6e8af272a28fd62d3cd5f845021b2983d380eb75ff36788623fb2640de871609 Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.278431 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.278644 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.278748 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert podName:a7308ded-ef41-499c-ae52-13d9e32b51e1 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:19.278722543 +0000 UTC m=+1025.369689038 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert") pod "infra-operator-controller-manager-669fff9c7c-7tx6m" (UID: "a7308ded-ef41-499c-ae52-13d9e32b51e1") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.384895 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7"] Mar 20 11:12:17 crc kubenswrapper[4772]: W0320 11:12:17.392915 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode23924f7_44e6_4464_a6d6_240718124df8.slice/crio-cdb1cf53a60f4a54d3a6d0db319a6ce2ac22a5ac465ba7c1dbf631cae61d4671 WatchSource:0}: Error finding container cdb1cf53a60f4a54d3a6d0db319a6ce2ac22a5ac465ba7c1dbf631cae61d4671: Status 404 returned error can't find the container with id cdb1cf53a60f4a54d3a6d0db319a6ce2ac22a5ac465ba7c1dbf631cae61d4671 Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.419435 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6"] Mar 20 11:12:17 crc kubenswrapper[4772]: W0320 11:12:17.433190 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod953ed1af_7caa_4fe5_8443_3e5aa1caa77c.slice/crio-4a2c0cb3da4e7564a5655b162cec3cd63ae93fa620701bb61fdb9630db5d7d61 WatchSource:0}: Error finding container 4a2c0cb3da4e7564a5655b162cec3cd63ae93fa620701bb61fdb9630db5d7d61: Status 404 returned error can't find the container with id 4a2c0cb3da4e7564a5655b162cec3cd63ae93fa620701bb61fdb9630db5d7d61 Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.437677 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq"] Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.451385 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4"] Mar 20 11:12:17 crc kubenswrapper[4772]: W0320 11:12:17.457287 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3f853c54_2e6e_41b9_b15f_3435b17477f2.slice/crio-f3ad99c659ea5035bbc1d56ef7b3108aa26da0f0679cf9e71ef1846c36e03f0c WatchSource:0}: Error finding container f3ad99c659ea5035bbc1d56ef7b3108aa26da0f0679cf9e71ef1846c36e03f0c: Status 404 returned error can't find the container with id f3ad99c659ea5035bbc1d56ef7b3108aa26da0f0679cf9e71ef1846c36e03f0c Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.461422 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c"] Mar 20 11:12:17 crc kubenswrapper[4772]: W0320 11:12:17.461597 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod546c5d3b_9054_45fb_9e45_95d01b61d012.slice/crio-e869ec13df960124c9a2e306d8cbd6c6079303ebe7a988ec0f645f7e18e76c2c WatchSource:0}: Error finding container e869ec13df960124c9a2e306d8cbd6c6079303ebe7a988ec0f645f7e18e76c2c: Status 404 returned error can't find the container with id e869ec13df960124c9a2e306d8cbd6c6079303ebe7a988ec0f645f7e18e76c2c Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.476000 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq"] Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.476089 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg"] Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.480412 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv"] Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.480847 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9"] Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.505806 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9ctxk,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod mariadb-operator-controller-manager-67ccfc9778-qcwm6_openstack-operators(953ed1af-7caa-4fe5-8443-3e5aa1caa77c): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.515946 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" podUID="953ed1af-7caa-4fe5-8443-3e5aa1caa77c" Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.526267 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-fbzks,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod ovn-operator-controller-manager-884679f54-qdjbq_openstack-operators(574f98d0-6e6f-43b0-a4d1-b13a5d123536): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.531366 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" podUID="574f98d0-6e6f-43b0-a4d1-b13a5d123536" Mar 20 11:12:17 crc kubenswrapper[4772]: W0320 11:12:17.601479 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51140d29_3d51_44e1_a884_ffbad20bbb15.slice/crio-1bae335ff04f411d175711c89b0c41a8ff7d76b46fde7762da46f7c7ae43570d WatchSource:0}: Error finding container 1bae335ff04f411d175711c89b0c41a8ff7d76b46fde7762da46f7c7ae43570d: Status 404 returned error can't find the container with id 1bae335ff04f411d175711c89b0c41a8ff7d76b46fde7762da46f7c7ae43570d Mar 20 11:12:17 crc kubenswrapper[4772]: W0320 11:12:17.605647 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd85081ab_dafc_467c_9445_9b7b221a56ee.slice/crio-8774d1b8b46baa55b6f4c427367cdd62d68b41bd1f1a7e91dee95519efe9a5de WatchSource:0}: Error finding container 8774d1b8b46baa55b6f4c427367cdd62d68b41bd1f1a7e91dee95519efe9a5de: Status 404 returned error can't find the container with id 8774d1b8b46baa55b6f4c427367cdd62d68b41bd1f1a7e91dee95519efe9a5de Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.613277 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} 
BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-96fqz,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod manila-operator-controller-manager-55f864c847-bvmfv_openstack-operators(51140d29-3d51-44e1-a884-ffbad20bbb15): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.614974 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" podUID="51140d29-3d51-44e1-a884-ffbad20bbb15" Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.620183 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-7qz5b,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 
},Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod telemetry-operator-controller-manager-d6b694c5-nb5t9_openstack-operators(d85081ab-dafc-467c-9445-9b7b221a56ee): ErrImagePull: pull QPS exceeded" logger="UnhandledError" Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.621367 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"pull QPS exceeded\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" podUID="d85081ab-dafc-467c-9445-9b7b221a56ee" Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.685495 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cnssm\" (UID: \"a1c1c7a4-6840-4e52-b242-7de225eaac97\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.685694 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.685783 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert podName:a1c1c7a4-6840-4e52-b242-7de225eaac97 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:19.685762308 +0000 UTC m=+1025.776728843 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-cnssm" (UID: "a1c1c7a4-6840-4e52-b242-7de225eaac97") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.732231 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl" event={"ID":"127b144a-f395-409d-a0a4-b79b60a60c1f","Type":"ContainerStarted","Data":"bc742f9e64e1fdd11391468b152430c57b18070d835f4273addae1dc40578e6f"} Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.735333 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" event={"ID":"1854d930-56ec-441f-87d6-821b656cd195","Type":"ContainerStarted","Data":"ee6dad8b84d3ff683c60e8e2235ec52d06edbab2caa1e1ebca5491005b4ab570"} Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.737023 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4" event={"ID":"55e1d800-c5ed-4902-bfb3-b36e761a526b","Type":"ContainerStarted","Data":"77eb87710ea44f2fa2a82991436552721048190aa154ac9e4213fd164351b38a"} Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.739944 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c" event={"ID":"3f853c54-2e6e-41b9-b15f-3435b17477f2","Type":"ContainerStarted","Data":"f3ad99c659ea5035bbc1d56ef7b3108aa26da0f0679cf9e71ef1846c36e03f0c"} Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.742210 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" event={"ID":"51140d29-3d51-44e1-a884-ffbad20bbb15","Type":"ContainerStarted","Data":"1bae335ff04f411d175711c89b0c41a8ff7d76b46fde7762da46f7c7ae43570d"} Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.744595 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" podUID="51140d29-3d51-44e1-a884-ffbad20bbb15" Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.752708 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" event={"ID":"d85081ab-dafc-467c-9445-9b7b221a56ee","Type":"ContainerStarted","Data":"8774d1b8b46baa55b6f4c427367cdd62d68b41bd1f1a7e91dee95519efe9a5de"} Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.754898 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" podUID="d85081ab-dafc-467c-9445-9b7b221a56ee" Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.777705 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" 
event={"ID":"574f98d0-6e6f-43b0-a4d1-b13a5d123536","Type":"ContainerStarted","Data":"178d266830db80e526f3866c5f288b28290a252114e0543f4156ba387db09b81"} Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.782808 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" podUID="574f98d0-6e6f-43b0-a4d1-b13a5d123536" Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.789321 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw" event={"ID":"52caed6a-6dcf-40d0-929c-948d0e421958","Type":"ContainerStarted","Data":"b8cf2d12b9d477ee73892118c493481e62e6ca5690753da73fb0db7fe8ff49b7"} Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.791531 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg" event={"ID":"32c94d79-871d-465b-afe6-e929661093c6","Type":"ContainerStarted","Data":"c57d6fe6784b3b2db6841c4d4d929f4c944f8e6216dd150004fd5e0d06119063"} Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.796782 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" event={"ID":"953ed1af-7caa-4fe5-8443-3e5aa1caa77c","Type":"ContainerStarted","Data":"4a2c0cb3da4e7564a5655b162cec3cd63ae93fa620701bb61fdb9630db5d7d61"} Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.801876 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7" event={"ID":"e23924f7-44e6-4464-a6d6-240718124df8","Type":"ContainerStarted","Data":"cdb1cf53a60f4a54d3a6d0db319a6ce2ac22a5ac465ba7c1dbf631cae61d4671"} Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.802058 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" podUID="953ed1af-7caa-4fe5-8443-3e5aa1caa77c" Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.803830 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf" event={"ID":"5c181b83-da5d-454e-a061-c647bde19d5e","Type":"ContainerStarted","Data":"6e8af272a28fd62d3cd5f845021b2983d380eb75ff36788623fb2640de871609"} Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.808091 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" event={"ID":"546c5d3b-9054-45fb-9e45-95d01b61d012","Type":"ContainerStarted","Data":"e869ec13df960124c9a2e306d8cbd6c6079303ebe7a988ec0f645f7e18e76c2c"} Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.818869 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg" event={"ID":"ab4a9bd4-c1e1-453d-b586-5089696704fb","Type":"ContainerStarted","Data":"3683850c6c0568038255884c878824a5cfe61961ef925b6b23c621f731eb4ad2"} Mar 20 11:12:17 crc 
kubenswrapper[4772]: I0320 11:12:17.822887 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" event={"ID":"e6248bcf-f076-4165-a9c7-0239c16e980d","Type":"ContainerStarted","Data":"69b1dec3e5467a8bb5fff735bd09e638712baebbd69355ae4cca9730d7c4fe53"} Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.891483 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:17 crc kubenswrapper[4772]: I0320 11:12:17.891533 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.891708 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.891761 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs podName:48ac539f-199f-49e4-8330-8956df8ea12f nodeName:}" failed. No retries permitted until 2026-03-20 11:12:19.891744678 +0000 UTC m=+1025.982711163 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-xp244" (UID: "48ac539f-199f-49e4-8330-8956df8ea12f") : secret "webhook-server-cert" not found Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.893585 4772 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:12:17 crc kubenswrapper[4772]: E0320 11:12:17.893873 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs podName:48ac539f-199f-49e4-8330-8956df8ea12f nodeName:}" failed. No retries permitted until 2026-03-20 11:12:19.893857017 +0000 UTC m=+1025.984823502 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-xp244" (UID: "48ac539f-199f-49e4-8330-8956df8ea12f") : secret "metrics-server-cert" not found Mar 20 11:12:18 crc kubenswrapper[4772]: E0320 11:12:18.857212 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/mariadb-operator@sha256:6e7552996253fc66667eaa3eb0e11b4e97145efa2ae577155ceabf8e9913ddc1\\\"\"" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" podUID="953ed1af-7caa-4fe5-8443-3e5aa1caa77c" Mar 20 11:12:18 crc kubenswrapper[4772]: E0320 11:12:18.857993 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/telemetry-operator@sha256:c500fa7080b94105e85eeced772d8872e4168904e74ba02116e15ab66f522444\\\"\"" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" podUID="d85081ab-dafc-467c-9445-9b7b221a56ee" Mar 20 11:12:18 crc kubenswrapper[4772]: E0320 11:12:18.858039 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/manila-operator@sha256:f2e0b0fb34995b8acbbf1b0b60b5dbcf488b4f3899d1bb0763ae7dcee9bae6da\\\"\"" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" podUID="51140d29-3d51-44e1-a884-ffbad20bbb15" Mar 20 11:12:18 crc kubenswrapper[4772]: E0320 11:12:18.874543 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/ovn-operator@sha256:bef93f71d3b42a72d8b96c69bdb4db4b8bd797c5093a0a719443d7a5c9aaab55\\\"\"" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" podUID="574f98d0-6e6f-43b0-a4d1-b13a5d123536" Mar 20 11:12:19 crc kubenswrapper[4772]: I0320 11:12:19.322582 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:19 crc kubenswrapper[4772]: E0320 11:12:19.322864 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:19 crc kubenswrapper[4772]: E0320 11:12:19.322920 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert podName:a7308ded-ef41-499c-ae52-13d9e32b51e1 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:23.322902455 +0000 UTC m=+1029.413868940 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert") pod "infra-operator-controller-manager-669fff9c7c-7tx6m" (UID: "a7308ded-ef41-499c-ae52-13d9e32b51e1") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:19 crc kubenswrapper[4772]: I0320 11:12:19.730564 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cnssm\" (UID: \"a1c1c7a4-6840-4e52-b242-7de225eaac97\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:19 crc kubenswrapper[4772]: E0320 11:12:19.730718 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:19 crc kubenswrapper[4772]: E0320 11:12:19.730804 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert podName:a1c1c7a4-6840-4e52-b242-7de225eaac97 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:23.730781103 +0000 UTC m=+1029.821747588 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-cnssm" (UID: "a1c1c7a4-6840-4e52-b242-7de225eaac97") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:19 crc kubenswrapper[4772]: I0320 11:12:19.941477 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:19 crc kubenswrapper[4772]: I0320 11:12:19.941616 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:19 crc kubenswrapper[4772]: E0320 11:12:19.941757 4772 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:12:19 crc kubenswrapper[4772]: E0320 11:12:19.941816 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:12:19 crc kubenswrapper[4772]: E0320 11:12:19.941899 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs podName:48ac539f-199f-49e4-8330-8956df8ea12f nodeName:}" failed. No retries permitted until 2026-03-20 11:12:23.941877245 +0000 UTC m=+1030.032843730 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-xp244" (UID: "48ac539f-199f-49e4-8330-8956df8ea12f") : secret "metrics-server-cert" not found Mar 20 11:12:19 crc kubenswrapper[4772]: E0320 11:12:19.941927 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs podName:48ac539f-199f-49e4-8330-8956df8ea12f nodeName:}" failed. No retries permitted until 2026-03-20 11:12:23.941916446 +0000 UTC m=+1030.032883141 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-xp244" (UID: "48ac539f-199f-49e4-8330-8956df8ea12f") : secret "webhook-server-cert" not found Mar 20 11:12:23 crc kubenswrapper[4772]: I0320 11:12:23.405382 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:23 crc kubenswrapper[4772]: E0320 11:12:23.405708 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:23 crc kubenswrapper[4772]: E0320 11:12:23.405983 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert podName:a7308ded-ef41-499c-ae52-13d9e32b51e1 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:31.405953793 +0000 UTC m=+1037.496920278 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert") pod "infra-operator-controller-manager-669fff9c7c-7tx6m" (UID: "a7308ded-ef41-499c-ae52-13d9e32b51e1") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:23 crc kubenswrapper[4772]: I0320 11:12:23.811734 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cnssm\" (UID: \"a1c1c7a4-6840-4e52-b242-7de225eaac97\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:23 crc kubenswrapper[4772]: E0320 11:12:23.812036 4772 secret.go:188] Couldn't get secret openstack-operators/openstack-baremetal-operator-webhook-server-cert: secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:23 crc kubenswrapper[4772]: E0320 11:12:23.812098 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert podName:a1c1c7a4-6840-4e52-b242-7de225eaac97 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:31.812077113 +0000 UTC m=+1037.903043628 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert") pod "openstack-baremetal-operator-controller-manager-89d64c458-cnssm" (UID: "a1c1c7a4-6840-4e52-b242-7de225eaac97") : secret "openstack-baremetal-operator-webhook-server-cert" not found Mar 20 11:12:24 crc kubenswrapper[4772]: I0320 11:12:24.014724 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:24 crc kubenswrapper[4772]: I0320 11:12:24.015126 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:24 crc kubenswrapper[4772]: E0320 11:12:24.014988 4772 secret.go:188] Couldn't get secret openstack-operators/metrics-server-cert: secret "metrics-server-cert" not found Mar 20 11:12:24 crc kubenswrapper[4772]: E0320 11:12:24.015256 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs podName:48ac539f-199f-49e4-8330-8956df8ea12f nodeName:}" failed. No retries permitted until 2026-03-20 11:12:32.015224606 +0000 UTC m=+1038.106191091 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs") pod "openstack-operator-controller-manager-6697dffbc-xp244" (UID: "48ac539f-199f-49e4-8330-8956df8ea12f") : secret "metrics-server-cert" not found Mar 20 11:12:24 crc kubenswrapper[4772]: E0320 11:12:24.015282 4772 secret.go:188] Couldn't get secret openstack-operators/webhook-server-cert: secret "webhook-server-cert" not found Mar 20 11:12:24 crc kubenswrapper[4772]: E0320 11:12:24.015345 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs podName:48ac539f-199f-49e4-8330-8956df8ea12f nodeName:}" failed. No retries permitted until 2026-03-20 11:12:32.015325909 +0000 UTC m=+1038.106292464 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "webhook-certs" (UniqueName: "kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs") pod "openstack-operator-controller-manager-6697dffbc-xp244" (UID: "48ac539f-199f-49e4-8330-8956df8ea12f") : secret "webhook-server-cert" not found Mar 20 11:12:28 crc kubenswrapper[4772]: I0320 11:12:28.582741 4772 scope.go:117] "RemoveContainer" containerID="42beb27ed312c971ba6b1e765200b7fff652794381276c3378000aa392f8c5f9" Mar 20 11:12:29 crc kubenswrapper[4772]: E0320 11:12:29.484812 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d" Mar 20 11:12:29 crc kubenswrapper[4772]: E0320 11:12:29.485372 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-mbnsc,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod glance-operator-controller-manager-79df6bcc97-224ts_openstack-operators(7ec0f9bb-bb24-4e95-8fb4-734eaee29058): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:29 crc kubenswrapper[4772]: E0320 11:12:29.486554 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" 
with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" podUID="7ec0f9bb-bb24-4e95-8fb4-734eaee29058" Mar 20 11:12:29 crc kubenswrapper[4772]: E0320 11:12:29.946083 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/glance-operator@sha256:76a1cde9f29fb39ed715b06be16adb803b9a2e24d68acb369911c0a88e33bc7d\\\"\"" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" podUID="7ec0f9bb-bb24-4e95-8fb4-734eaee29058" Mar 20 11:12:30 crc kubenswrapper[4772]: E0320 11:12:30.093091 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a" Mar 20 11:12:30 crc kubenswrapper[4772]: E0320 11:12:30.093288 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-pl5sp,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod neutron-operator-controller-manager-767865f676-wjc8s_openstack-operators(e6248bcf-f076-4165-a9c7-0239c16e980d): ErrImagePull: rpc error: code = Canceled desc = copying 
config: context canceled" logger="UnhandledError" Mar 20 11:12:30 crc kubenswrapper[4772]: E0320 11:12:30.094913 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" podUID="e6248bcf-f076-4165-a9c7-0239c16e980d" Mar 20 11:12:30 crc kubenswrapper[4772]: E0320 11:12:30.950990 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/neutron-operator@sha256:526f9d4965431e1a5e4f8c3224bcee3f636a3108a5e0767296a994c2a517404a\\\"\"" pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" podUID="e6248bcf-f076-4165-a9c7-0239c16e980d" Mar 20 11:12:31 crc kubenswrapper[4772]: I0320 11:12:31.426549 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:31 crc kubenswrapper[4772]: E0320 11:12:31.426783 4772 secret.go:188] Couldn't get secret openstack-operators/infra-operator-webhook-server-cert: secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:31 crc kubenswrapper[4772]: E0320 11:12:31.427116 4772 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert podName:a7308ded-ef41-499c-ae52-13d9e32b51e1 nodeName:}" failed. No retries permitted until 2026-03-20 11:12:47.427097352 +0000 UTC m=+1053.518063827 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert") pod "infra-operator-controller-manager-669fff9c7c-7tx6m" (UID: "a7308ded-ef41-499c-ae52-13d9e32b51e1") : secret "infra-operator-webhook-server-cert" not found Mar 20 11:12:31 crc kubenswrapper[4772]: I0320 11:12:31.832814 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cnssm\" (UID: \"a1c1c7a4-6840-4e52-b242-7de225eaac97\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:31 crc kubenswrapper[4772]: I0320 11:12:31.840913 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a1c1c7a4-6840-4e52-b242-7de225eaac97-cert\") pod \"openstack-baremetal-operator-controller-manager-89d64c458-cnssm\" (UID: \"a1c1c7a4-6840-4e52-b242-7de225eaac97\") " pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:31 crc kubenswrapper[4772]: I0320 11:12:31.965723 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:32 crc kubenswrapper[4772]: I0320 11:12:32.036374 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:32 crc kubenswrapper[4772]: I0320 11:12:32.036686 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:32 crc kubenswrapper[4772]: I0320 11:12:32.040274 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-webhook-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:32 crc kubenswrapper[4772]: I0320 11:12:32.040288 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/48ac539f-199f-49e4-8330-8956df8ea12f-metrics-certs\") pod \"openstack-operator-controller-manager-6697dffbc-xp244\" (UID: \"48ac539f-199f-49e4-8330-8956df8ea12f\") " pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:32 crc kubenswrapper[4772]: I0320 11:12:32.155142 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:36 crc kubenswrapper[4772]: E0320 11:12:36.964554 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777" Mar 20 11:12:36 crc kubenswrapper[4772]: E0320 11:12:36.965830 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-ppfh6,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod cinder-operator-controller-manager-8d58dc466-hlkjp_openstack-operators(f6d09f24-ca68-486c-8fb6-e34e3172077a): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:36 crc kubenswrapper[4772]: E0320 11:12:36.967107 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" podUID="f6d09f24-ca68-486c-8fb6-e34e3172077a" Mar 20 11:12:36 crc kubenswrapper[4772]: E0320 11:12:36.988955 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/cinder-operator@sha256:d8210bb21d4d298271a7b43f92fe58789393546e616aaaec1ce71bb2a754e777\\\"\"" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" podUID="f6d09f24-ca68-486c-8fb6-e34e3172077a" Mar 20 11:12:37 crc kubenswrapper[4772]: E0320 11:12:37.714966 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900" Mar 20 11:12:37 crc kubenswrapper[4772]: E0320 11:12:37.715514 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-zwzj2,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod heat-operator-controller-manager-67dd5f86f5-92hmc_openstack-operators(1854d930-56ec-441f-87d6-821b656cd195): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:37 crc kubenswrapper[4772]: E0320 11:12:37.716787 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" podUID="1854d930-56ec-441f-87d6-821b656cd195" Mar 20 11:12:37 crc kubenswrapper[4772]: E0320 11:12:37.994895 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/heat-operator@sha256:c6ef5db244d874430a56c3cc9d27662e4bd57cdaa489e1f6059abcacf3aa0900\\\"\"" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" podUID="1854d930-56ec-441f-87d6-821b656cd195" Mar 20 11:12:38 crc kubenswrapper[4772]: E0320 11:12:38.341752 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a" Mar 20 11:12:38 crc kubenswrapper[4772]: E0320 11:12:38.341992 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="container &Container{Name:manager,Image:quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a,Command:[/manager],Args:[--leader-elect --health-probe-bind-address=:8081 --metrics-bind-address=127.0.0.1:8080],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LEASE_DURATION,Value:30,ValueFrom:nil,},EnvVar{Name:RENEW_DEADLINE,Value:20,ValueFrom:nil,},EnvVar{Name:RETRY_PERIOD,Value:5,ValueFrom:nil,},EnvVar{Name:ENABLE_WEBHOOKS,Value:false,ValueFrom:nil,},EnvVar{Name:METRICS_CERTS,Value:false,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{cpu: {{500 -3} {} 500m DecimalSI},memory: {{536870912 0} {} BinarySI},},Requests:ResourceList{cpu: {{10 -3} {} 10m DecimalSI},memory: {{268435456 0} {} BinarySI},},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-bstl8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/healthz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:15,TimeoutSeconds:1,PeriodSeconds:20,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 8081 },Host:,Scheme:HTTP,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:5,TimeoutSeconds:1,PeriodSeconds:10,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[MKNOD],},Privileged:nil,SELinuxOptions:nil,RunAsUser:nil,RunAsNonRoot:nil,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod nova-operator-controller-manager-5d488d59fb-jxssq_openstack-operators(546c5d3b-9054-45fb-9e45-95d01b61d012): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:12:38 crc kubenswrapper[4772]: E0320 
11:12:38.343187 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" podUID="546c5d3b-9054-45fb-9e45-95d01b61d012" Mar 20 11:12:38 crc kubenswrapper[4772]: I0320 11:12:38.849652 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm"] Mar 20 11:12:38 crc kubenswrapper[4772]: I0320 11:12:38.890334 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244"] Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.002237 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg" event={"ID":"32c94d79-871d-465b-afe6-e929661093c6","Type":"ContainerStarted","Data":"7ec2ab20b1761d63f13c44097dfc39724f9ea0cc7bd0b471fab03a67f7e720e7"} Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.002326 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg" Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.004244 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d" event={"ID":"9980fd55-eca0-4c27-a021-59acc8681bfd","Type":"ContainerStarted","Data":"95989444db73de6c9c02c0fe3fcee0658f7d521854729d4979bfeed650827d9f"} Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.004339 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d" Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.011998 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c" event={"ID":"3f853c54-2e6e-41b9-b15f-3435b17477f2","Type":"ContainerStarted","Data":"7f6d0296efeb0ede2b5ae97b85eaf7b8bccd19fdacda9705a38a90f76af0ef29"} Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.012129 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c" Mar 20 11:12:39 crc kubenswrapper[4772]: E0320 11:12:39.012539 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"manager\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/openstack-k8s-operators/nova-operator@sha256:7398eb8fa5a4844d3326a5dff759d17199870c389b3ce3011a038b27bf95512a\\\"\"" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" podUID="546c5d3b-9054-45fb-9e45-95d01b61d012" Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.029086 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg" podStartSLOduration=3.156462045 podStartE2EDuration="24.029055459s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:17.499147061 +0000 UTC m=+1023.590113546" lastFinishedPulling="2026-03-20 11:12:38.371740475 +0000 UTC m=+1044.462706960" observedRunningTime="2026-03-20 11:12:39.021876192 +0000 UTC m=+1045.112842677" watchObservedRunningTime="2026-03-20 11:12:39.029055459 +0000 UTC 
m=+1045.120021974" Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.060103 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d" podStartSLOduration=2.19759807 podStartE2EDuration="24.060086463s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:16.479890463 +0000 UTC m=+1022.570856948" lastFinishedPulling="2026-03-20 11:12:38.342378856 +0000 UTC m=+1044.433345341" observedRunningTime="2026-03-20 11:12:39.04289851 +0000 UTC m=+1045.133865015" watchObservedRunningTime="2026-03-20 11:12:39.060086463 +0000 UTC m=+1045.151052948" Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.079232 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c" podStartSLOduration=3.224964392 podStartE2EDuration="24.07921487s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:17.487910782 +0000 UTC m=+1023.578877267" lastFinishedPulling="2026-03-20 11:12:38.34216126 +0000 UTC m=+1044.433127745" observedRunningTime="2026-03-20 11:12:39.074615053 +0000 UTC m=+1045.165581538" watchObservedRunningTime="2026-03-20 11:12:39.07921487 +0000 UTC m=+1045.170181355" Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.564408 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.564463 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.564543 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.565179 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3a2dc425dd346ae424a2a128cb64ede7d6abbbfbc7a26799f2508db56e373109"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:12:39 crc kubenswrapper[4772]: I0320 11:12:39.565232 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://3a2dc425dd346ae424a2a128cb64ede7d6abbbfbc7a26799f2508db56e373109" gracePeriod=600 Mar 20 11:12:39 crc kubenswrapper[4772]: W0320 11:12:39.751758 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod48ac539f_199f_49e4_8330_8956df8ea12f.slice/crio-d763cffed08b638249eb59cee5447b393e91d43e00f20b0b113e902d126741e2 WatchSource:0}: Error finding container d763cffed08b638249eb59cee5447b393e91d43e00f20b0b113e902d126741e2: Status 404 returned 
error can't find the container with id d763cffed08b638249eb59cee5447b393e91d43e00f20b0b113e902d126741e2 Mar 20 11:12:40 crc kubenswrapper[4772]: I0320 11:12:40.019388 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" event={"ID":"a1c1c7a4-6840-4e52-b242-7de225eaac97","Type":"ContainerStarted","Data":"37867f3314bbf612140691d111936569ceb252c24786d13ec2d5864806cb67d8"} Mar 20 11:12:40 crc kubenswrapper[4772]: I0320 11:12:40.020733 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t" event={"ID":"3b1e9a1f-4847-46ed-9239-11b64f01ef55","Type":"ContainerStarted","Data":"abd2611549f1ae897d43b18f05f464443a1672e85bb795a20b6c8c54faa718a7"} Mar 20 11:12:40 crc kubenswrapper[4772]: I0320 11:12:40.021688 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t" Mar 20 11:12:40 crc kubenswrapper[4772]: I0320 11:12:40.025817 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg" event={"ID":"ab4a9bd4-c1e1-453d-b586-5089696704fb","Type":"ContainerStarted","Data":"a266a1561f8f831085103d0c7e9c7facaf5c91cc3b8fc2431e91c7372bf676ae"} Mar 20 11:12:40 crc kubenswrapper[4772]: I0320 11:12:40.025922 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg" Mar 20 11:12:40 crc kubenswrapper[4772]: I0320 11:12:40.026937 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" event={"ID":"48ac539f-199f-49e4-8330-8956df8ea12f","Type":"ContainerStarted","Data":"d763cffed08b638249eb59cee5447b393e91d43e00f20b0b113e902d126741e2"} Mar 20 11:12:40 crc kubenswrapper[4772]: I0320 11:12:40.029666 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="3a2dc425dd346ae424a2a128cb64ede7d6abbbfbc7a26799f2508db56e373109" exitCode=0 Mar 20 11:12:40 crc kubenswrapper[4772]: I0320 11:12:40.029985 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"3a2dc425dd346ae424a2a128cb64ede7d6abbbfbc7a26799f2508db56e373109"} Mar 20 11:12:40 crc kubenswrapper[4772]: I0320 11:12:40.030027 4772 scope.go:117] "RemoveContainer" containerID="c031238f25d43745bddff1c50d95ad51119ab5adc3b084d1a2d3a9cfa70802a1" Mar 20 11:12:40 crc kubenswrapper[4772]: I0320 11:12:40.039333 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t" podStartSLOduration=3.176515208 podStartE2EDuration="25.03932176s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:16.479462301 +0000 UTC m=+1022.570428786" lastFinishedPulling="2026-03-20 11:12:38.342268853 +0000 UTC m=+1044.433235338" observedRunningTime="2026-03-20 11:12:40.035121674 +0000 UTC m=+1046.126088159" watchObservedRunningTime="2026-03-20 11:12:40.03932176 +0000 UTC m=+1046.130288245" Mar 20 11:12:40 crc kubenswrapper[4772]: I0320 11:12:40.060662 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg" podStartSLOduration=3.602621017 podStartE2EDuration="25.060638676s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:16.885446897 +0000 UTC m=+1022.976413382" lastFinishedPulling="2026-03-20 11:12:38.343464556 +0000 UTC m=+1044.434431041" observedRunningTime="2026-03-20 11:12:40.053541172 +0000 UTC m=+1046.144507657" watchObservedRunningTime="2026-03-20 11:12:40.060638676 +0000 UTC m=+1046.151605161" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.064188 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4" event={"ID":"55e1d800-c5ed-4902-bfb3-b36e761a526b","Type":"ContainerStarted","Data":"608ab99012579e59da1a88f989946114ae52a14d820e03d6f6844ca18f1301ef"} Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.065225 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.074389 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" event={"ID":"48ac539f-199f-49e4-8330-8956df8ea12f","Type":"ContainerStarted","Data":"e719a643712a206618854c7bdcc39a4d180204b66d55da8fd8898f4b7045ad4a"} Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.075379 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.082530 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" event={"ID":"7ec0f9bb-bb24-4e95-8fb4-734eaee29058","Type":"ContainerStarted","Data":"bb81803dce26ff09f94b5545c9f36d81832fe732a1ca1aef676d93e1bd1003a1"} Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.083027 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.092099 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4" podStartSLOduration=7.155855831 podStartE2EDuration="28.092076397s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:17.447154269 +0000 UTC m=+1023.538120754" lastFinishedPulling="2026-03-20 11:12:38.383374835 +0000 UTC m=+1044.474341320" observedRunningTime="2026-03-20 11:12:43.084018225 +0000 UTC m=+1049.174984710" watchObservedRunningTime="2026-03-20 11:12:43.092076397 +0000 UTC m=+1049.183042882" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.095624 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"d53ebb3bfe8693516c060197c431334b6d14ac36c11fb07c44a8694176495d3d"} Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.102514 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" event={"ID":"953ed1af-7caa-4fe5-8443-3e5aa1caa77c","Type":"ContainerStarted","Data":"77fcaed439d6b1693565aa2bbdd6154919735765da599c45aedd97bf48f281a7"} Mar 20 
11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.103074 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.106348 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7" event={"ID":"e23924f7-44e6-4464-a6d6-240718124df8","Type":"ContainerStarted","Data":"33ff4b30bb92debc844a1e30de3afa6e3b7474ae70036517b7e126c55f78e374"} Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.106505 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.108912 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" event={"ID":"51140d29-3d51-44e1-a884-ffbad20bbb15","Type":"ContainerStarted","Data":"5b2b5a8c8026df869939ddd62a96f57ea7d515f1c04edc6863ddccaad9548c7c"} Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.109203 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.113060 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf" event={"ID":"5c181b83-da5d-454e-a061-c647bde19d5e","Type":"ContainerStarted","Data":"fe5a947cc683633873f160bf6b4d054cab0f5d9b291ba8a7726e929c972ee2bb"} Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.115815 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" podStartSLOduration=2.375432716 podStartE2EDuration="28.11579323s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:16.434672458 +0000 UTC m=+1022.525638943" lastFinishedPulling="2026-03-20 11:12:42.175032972 +0000 UTC m=+1048.265999457" observedRunningTime="2026-03-20 11:12:43.110900195 +0000 UTC m=+1049.201866680" watchObservedRunningTime="2026-03-20 11:12:43.11579323 +0000 UTC m=+1049.206759735" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.116160 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" event={"ID":"574f98d0-6e6f-43b0-a4d1-b13a5d123536","Type":"ContainerStarted","Data":"76e25cec736b1d7b2e850afc163a84b5e1c25e975f1c1ad896fd6b2388415e50"} Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.116942 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.118250 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" event={"ID":"d85081ab-dafc-467c-9445-9b7b221a56ee","Type":"ContainerStarted","Data":"d0ed1409182aeb21c2f3ad6b04e7027401c615d2c3178436b5038fea9a643ec8"} Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.118469 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.129215 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw" event={"ID":"52caed6a-6dcf-40d0-929c-948d0e421958","Type":"ContainerStarted","Data":"826c1a9c26eac079205ad07ef6c9260056e317a9d3210aba5ecf1662586fe3fc"} Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.129496 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.138619 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl" event={"ID":"127b144a-f395-409d-a0a4-b79b60a60c1f","Type":"ContainerStarted","Data":"529eb6aa122ad916954c79c366d1866d98112343306c8e48b919b5c918343c5c"} Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.139243 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.162660 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" podStartSLOduration=27.16264133 podStartE2EDuration="27.16264133s" podCreationTimestamp="2026-03-20 11:12:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:12:43.151079712 +0000 UTC m=+1049.242046197" watchObservedRunningTime="2026-03-20 11:12:43.16264133 +0000 UTC m=+1049.253607815" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.180231 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" podStartSLOduration=3.780981437 podStartE2EDuration="28.180211833s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:17.619980197 +0000 UTC m=+1023.710946682" lastFinishedPulling="2026-03-20 11:12:42.019210593 +0000 UTC m=+1048.110177078" observedRunningTime="2026-03-20 11:12:43.178409743 +0000 UTC m=+1049.269376228" watchObservedRunningTime="2026-03-20 11:12:43.180211833 +0000 UTC m=+1049.271178338" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.201688 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw" podStartSLOduration=6.8158912130000004 podStartE2EDuration="28.201664274s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:16.986036116 +0000 UTC m=+1023.077002591" lastFinishedPulling="2026-03-20 11:12:38.371809167 +0000 UTC m=+1044.462775652" observedRunningTime="2026-03-20 11:12:43.193784987 +0000 UTC m=+1049.284751472" watchObservedRunningTime="2026-03-20 11:12:43.201664274 +0000 UTC m=+1049.292630759" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.230606 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" podStartSLOduration=3.775567368 podStartE2EDuration="28.23058393s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:17.613055806 +0000 UTC m=+1023.704022291" lastFinishedPulling="2026-03-20 11:12:42.068072368 +0000 UTC m=+1048.159038853" observedRunningTime="2026-03-20 11:12:43.224466852 +0000 UTC m=+1049.315433337" watchObservedRunningTime="2026-03-20 
11:12:43.23058393 +0000 UTC m=+1049.321550415" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.250206 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" podStartSLOduration=3.734117436 podStartE2EDuration="28.250181029s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:17.526121113 +0000 UTC m=+1023.617087598" lastFinishedPulling="2026-03-20 11:12:42.042184706 +0000 UTC m=+1048.133151191" observedRunningTime="2026-03-20 11:12:43.244588425 +0000 UTC m=+1049.335554930" watchObservedRunningTime="2026-03-20 11:12:43.250181029 +0000 UTC m=+1049.341147514" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.266583 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" podStartSLOduration=3.700914784 podStartE2EDuration="28.26655909s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:17.505678471 +0000 UTC m=+1023.596644956" lastFinishedPulling="2026-03-20 11:12:42.071322777 +0000 UTC m=+1048.162289262" observedRunningTime="2026-03-20 11:12:43.264334809 +0000 UTC m=+1049.355301284" watchObservedRunningTime="2026-03-20 11:12:43.26655909 +0000 UTC m=+1049.357525575" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.280130 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7" podStartSLOduration=7.337252655 podStartE2EDuration="28.280113533s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:17.399295972 +0000 UTC m=+1023.490262457" lastFinishedPulling="2026-03-20 11:12:38.34215684 +0000 UTC m=+1044.433123335" observedRunningTime="2026-03-20 11:12:43.2774394 +0000 UTC m=+1049.368405885" watchObservedRunningTime="2026-03-20 11:12:43.280113533 +0000 UTC m=+1049.371080018" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.313773 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf" podStartSLOduration=7.159967984 podStartE2EDuration="28.31375289s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:17.218197736 +0000 UTC m=+1023.309164221" lastFinishedPulling="2026-03-20 11:12:38.371982642 +0000 UTC m=+1044.462949127" observedRunningTime="2026-03-20 11:12:43.31232557 +0000 UTC m=+1049.403292085" watchObservedRunningTime="2026-03-20 11:12:43.31375289 +0000 UTC m=+1049.404719385" Mar 20 11:12:43 crc kubenswrapper[4772]: I0320 11:12:43.357241 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl" podStartSLOduration=6.888388018 podStartE2EDuration="28.357215515s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:16.875652337 +0000 UTC m=+1022.966618822" lastFinishedPulling="2026-03-20 11:12:38.344479834 +0000 UTC m=+1044.435446319" observedRunningTime="2026-03-20 11:12:43.346439149 +0000 UTC m=+1049.437405634" watchObservedRunningTime="2026-03-20 11:12:43.357215515 +0000 UTC m=+1049.448182000" Mar 20 11:12:44 crc kubenswrapper[4772]: I0320 11:12:44.147965 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf" Mar 20 
11:12:45 crc kubenswrapper[4772]: I0320 11:12:45.153809 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" event={"ID":"e6248bcf-f076-4165-a9c7-0239c16e980d","Type":"ContainerStarted","Data":"5e4fc469f03fac99d5b5115bb9f94a643ddad7fc28b3bf2b20ed81fa8ee3c527"} Mar 20 11:12:45 crc kubenswrapper[4772]: I0320 11:12:45.154310 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" Mar 20 11:12:45 crc kubenswrapper[4772]: I0320 11:12:45.155837 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" event={"ID":"a1c1c7a4-6840-4e52-b242-7de225eaac97","Type":"ContainerStarted","Data":"19d950ba974a743053776c3aa2671d338e4a3573aa5815782dc5719253ff1a14"} Mar 20 11:12:45 crc kubenswrapper[4772]: I0320 11:12:45.178694 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" podStartSLOduration=2.483908291 podStartE2EDuration="30.178678907s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:17.021385609 +0000 UTC m=+1023.112352094" lastFinishedPulling="2026-03-20 11:12:44.716156225 +0000 UTC m=+1050.807122710" observedRunningTime="2026-03-20 11:12:45.174117962 +0000 UTC m=+1051.265084467" watchObservedRunningTime="2026-03-20 11:12:45.178678907 +0000 UTC m=+1051.269645382" Mar 20 11:12:45 crc kubenswrapper[4772]: I0320 11:12:45.211557 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" podStartSLOduration=25.279019208 podStartE2EDuration="30.211533402s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:39.78468597 +0000 UTC m=+1045.875652455" lastFinishedPulling="2026-03-20 11:12:44.717200164 +0000 UTC m=+1050.808166649" observedRunningTime="2026-03-20 11:12:45.196914519 +0000 UTC m=+1051.287881004" watchObservedRunningTime="2026-03-20 11:12:45.211533402 +0000 UTC m=+1051.302499887" Mar 20 11:12:45 crc kubenswrapper[4772]: I0320 11:12:45.716494 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/barbican-operator-controller-manager-59bc569d95-t4z2d" Mar 20 11:12:45 crc kubenswrapper[4772]: I0320 11:12:45.767669 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/designate-operator-controller-manager-588d4d986b-62t9t" Mar 20 11:12:45 crc kubenswrapper[4772]: I0320 11:12:45.820616 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-8464cc45fb-kh9lg" Mar 20 11:12:46 crc kubenswrapper[4772]: I0320 11:12:46.162057 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:46 crc kubenswrapper[4772]: I0320 11:12:46.306275 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/placement-operator-controller-manager-5784578c99-jtzxg" Mar 20 11:12:46 crc kubenswrapper[4772]: I0320 11:12:46.479674 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/watcher-operator-controller-manager-6c4d75f7f9-f488c" 
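[editor's note] The entries above show the kubelet failing MountVolume.SetUp for secret-backed volumes ("metrics-server-cert", "webhook-server-cert", "infra-operator-webhook-server-cert", "openstack-baremetal-operator-webhook-server-cert" in the openstack-operators namespace), retrying with doubling backoff (durationBeforeRetry 4s, 8s, 16s), and then succeeding once the secrets exist, after which the operator pods start and pass readiness. The sketch below is not part of the log; it is a minimal, hypothetical Python helper (assuming the "kubernetes" client package and a kubeconfig with read access to that namespace) that polls for those same secrets, which is one way to confirm externally that the condition the kubelet was waiting on has cleared.

# Hypothetical helper: poll until the certificate secrets named in the log exist.
# Assumes: `pip install kubernetes` and a kubeconfig for this cluster.
import time
from kubernetes import client, config
from kubernetes.client.rest import ApiException

NAMESPACE = "openstack-operators"
SECRETS = [
    "metrics-server-cert",
    "webhook-server-cert",
    "infra-operator-webhook-server-cert",
    "openstack-baremetal-operator-webhook-server-cert",
]

def wait_for_secrets(timeout_s=300, poll_s=5):
    config.load_kube_config()
    v1 = client.CoreV1Api()
    deadline = time.time() + timeout_s
    missing = set(SECRETS)
    while missing and time.time() < deadline:
        for name in sorted(missing):
            try:
                v1.read_namespaced_secret(name, NAMESPACE)
                missing.discard(name)   # secret exists, so the volume mount can proceed
            except ApiException as exc:
                if exc.status != 404:
                    raise               # anything other than "not found" is unexpected
        if missing:
            time.sleep(poll_s)          # the kubelet itself retries with 4s/8s/16s backoff
    return missing                      # empty set means every secret is present

if __name__ == "__main__":
    still_missing = wait_for_secrets()
    print("still missing:", sorted(still_missing) or "none")

[end editor's note]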
Mar 20 11:12:47 crc kubenswrapper[4772]: I0320 11:12:47.436587 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:47 crc kubenswrapper[4772]: I0320 11:12:47.442502 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/a7308ded-ef41-499c-ae52-13d9e32b51e1-cert\") pod \"infra-operator-controller-manager-669fff9c7c-7tx6m\" (UID: \"a7308ded-ef41-499c-ae52-13d9e32b51e1\") " pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:47 crc kubenswrapper[4772]: I0320 11:12:47.676161 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:48 crc kubenswrapper[4772]: I0320 11:12:48.118793 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m"] Mar 20 11:12:48 crc kubenswrapper[4772]: W0320 11:12:48.128708 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7308ded_ef41_499c_ae52_13d9e32b51e1.slice/crio-fda08550de0d433d4285cced68de36d81071e45efcbab53f72ea91c2d0864f0a WatchSource:0}: Error finding container fda08550de0d433d4285cced68de36d81071e45efcbab53f72ea91c2d0864f0a: Status 404 returned error can't find the container with id fda08550de0d433d4285cced68de36d81071e45efcbab53f72ea91c2d0864f0a Mar 20 11:12:48 crc kubenswrapper[4772]: I0320 11:12:48.177507 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" event={"ID":"a7308ded-ef41-499c-ae52-13d9e32b51e1","Type":"ContainerStarted","Data":"fda08550de0d433d4285cced68de36d81071e45efcbab53f72ea91c2d0864f0a"} Mar 20 11:12:50 crc kubenswrapper[4772]: I0320 11:12:50.191953 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" event={"ID":"a7308ded-ef41-499c-ae52-13d9e32b51e1","Type":"ContainerStarted","Data":"a4dbb219fb3767ef1e9e269063b1e226917f0efded9b31f07439f4f99d04d645"} Mar 20 11:12:50 crc kubenswrapper[4772]: I0320 11:12:50.192421 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:12:50 crc kubenswrapper[4772]: I0320 11:12:50.213747 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" podStartSLOduration=33.653471572 podStartE2EDuration="35.213730373s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:48.131920455 +0000 UTC m=+1054.222886940" lastFinishedPulling="2026-03-20 11:12:49.692179256 +0000 UTC m=+1055.783145741" observedRunningTime="2026-03-20 11:12:50.211226644 +0000 UTC m=+1056.302193139" watchObservedRunningTime="2026-03-20 11:12:50.213730373 +0000 UTC m=+1056.304696858" Mar 20 11:12:51 crc kubenswrapper[4772]: I0320 11:12:51.199697 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" 
event={"ID":"f6d09f24-ca68-486c-8fb6-e34e3172077a","Type":"ContainerStarted","Data":"f71cc7446135729d5367a3c64f9ef9ef63320b3e6dc0d4e9157d2ce344ab6850"} Mar 20 11:12:51 crc kubenswrapper[4772]: I0320 11:12:51.199988 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" Mar 20 11:12:51 crc kubenswrapper[4772]: I0320 11:12:51.201301 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" event={"ID":"546c5d3b-9054-45fb-9e45-95d01b61d012","Type":"ContainerStarted","Data":"39bce5de1f9347d32c33fb34dca35ec521b57b137916f80b35224dee9ec53348"} Mar 20 11:12:51 crc kubenswrapper[4772]: I0320 11:12:51.201515 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" Mar 20 11:12:51 crc kubenswrapper[4772]: I0320 11:12:51.219335 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" podStartSLOduration=2.589973192 podStartE2EDuration="36.219318875s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:16.479752849 +0000 UTC m=+1022.570719334" lastFinishedPulling="2026-03-20 11:12:50.109098532 +0000 UTC m=+1056.200065017" observedRunningTime="2026-03-20 11:12:51.218259786 +0000 UTC m=+1057.309226271" watchObservedRunningTime="2026-03-20 11:12:51.219318875 +0000 UTC m=+1057.310285360" Mar 20 11:12:51 crc kubenswrapper[4772]: I0320 11:12:51.235386 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" podStartSLOduration=3.5792139929999998 podStartE2EDuration="36.235367517s" podCreationTimestamp="2026-03-20 11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:17.499549192 +0000 UTC m=+1023.590515677" lastFinishedPulling="2026-03-20 11:12:50.155702716 +0000 UTC m=+1056.246669201" observedRunningTime="2026-03-20 11:12:51.230775121 +0000 UTC m=+1057.321741606" watchObservedRunningTime="2026-03-20 11:12:51.235367517 +0000 UTC m=+1057.326334002" Mar 20 11:12:51 crc kubenswrapper[4772]: I0320 11:12:51.971565 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-baremetal-operator-controller-manager-89d64c458-cnssm" Mar 20 11:12:52 crc kubenswrapper[4772]: I0320 11:12:52.162172 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/openstack-operator-controller-manager-6697dffbc-xp244" Mar 20 11:12:53 crc kubenswrapper[4772]: I0320 11:12:53.217230 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" event={"ID":"1854d930-56ec-441f-87d6-821b656cd195","Type":"ContainerStarted","Data":"88502cab07873a591e3399153cedd6123cc64953cda96d0268cb054f00e93846"} Mar 20 11:12:54 crc kubenswrapper[4772]: I0320 11:12:54.223920 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" Mar 20 11:12:54 crc kubenswrapper[4772]: I0320 11:12:54.241507 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" podStartSLOduration=3.05200709 podStartE2EDuration="39.241489249s" podCreationTimestamp="2026-03-20 
11:12:15 +0000 UTC" firstStartedPulling="2026-03-20 11:12:16.886689611 +0000 UTC m=+1022.977656096" lastFinishedPulling="2026-03-20 11:12:53.07617177 +0000 UTC m=+1059.167138255" observedRunningTime="2026-03-20 11:12:54.239379162 +0000 UTC m=+1060.330345647" watchObservedRunningTime="2026-03-20 11:12:54.241489249 +0000 UTC m=+1060.332455724" Mar 20 11:12:55 crc kubenswrapper[4772]: I0320 11:12:55.734670 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/cinder-operator-controller-manager-8d58dc466-hlkjp" Mar 20 11:12:55 crc kubenswrapper[4772]: I0320 11:12:55.773596 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-79df6bcc97-224ts" Mar 20 11:12:55 crc kubenswrapper[4772]: I0320 11:12:55.923820 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ironic-operator-controller-manager-6f787dddc9-tcjvl" Mar 20 11:12:56 crc kubenswrapper[4772]: I0320 11:12:56.016142 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-768b96df4c-g8thw" Mar 20 11:12:56 crc kubenswrapper[4772]: I0320 11:12:56.049129 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/manila-operator-controller-manager-55f864c847-bvmfv" Mar 20 11:12:56 crc kubenswrapper[4772]: I0320 11:12:56.082681 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-controller-manager-67ccfc9778-qcwm6" Mar 20 11:12:56 crc kubenswrapper[4772]: I0320 11:12:56.108015 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/neutron-operator-controller-manager-767865f676-wjc8s" Mar 20 11:12:56 crc kubenswrapper[4772]: I0320 11:12:56.193920 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/nova-operator-controller-manager-5d488d59fb-jxssq" Mar 20 11:12:56 crc kubenswrapper[4772]: I0320 11:12:56.252210 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/octavia-operator-controller-manager-5b9f45d989-7p7tf" Mar 20 11:12:56 crc kubenswrapper[4772]: I0320 11:12:56.328987 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/ovn-operator-controller-manager-884679f54-qdjbq" Mar 20 11:12:56 crc kubenswrapper[4772]: I0320 11:12:56.349202 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-c674c5965-lz8q4" Mar 20 11:12:56 crc kubenswrapper[4772]: I0320 11:12:56.397509 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/telemetry-operator-controller-manager-d6b694c5-nb5t9" Mar 20 11:12:56 crc kubenswrapper[4772]: I0320 11:12:56.425723 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/test-operator-controller-manager-5c5cb9c4d7-s5gs7" Mar 20 11:12:57 crc kubenswrapper[4772]: I0320 11:12:57.683520 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-669fff9c7c-7tx6m" Mar 20 11:13:05 crc kubenswrapper[4772]: I0320 11:13:05.792924 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openstack-operators/heat-operator-controller-manager-67dd5f86f5-92hmc" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.429230 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gj8ln"] Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.431682 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.436146 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.436222 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openstack"/"dnsmasq-dns-dockercfg-x5xjq" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.436155 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"kube-root-ca.crt" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.437124 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"openshift-service-ca.crt" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.484111 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tkq8w"] Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.485346 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.492911 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gj8ln"] Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.493581 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack"/"dns-svc" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.504786 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tkq8w"] Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.553957 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed0d482-4eee-44dc-9f3c-74fab49dd624-config\") pod \"dnsmasq-dns-675f4bcbfc-gj8ln\" (UID: \"aed0d482-4eee-44dc-9f3c-74fab49dd624\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.554026 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nm4bd\" (UniqueName: \"kubernetes.io/projected/69d23241-cdf8-4417-bb47-d5541e49fb12-kube-api-access-nm4bd\") pod \"dnsmasq-dns-78dd6ddcc-tkq8w\" (UID: \"69d23241-cdf8-4417-bb47-d5541e49fb12\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.554072 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4vnl\" (UniqueName: \"kubernetes.io/projected/aed0d482-4eee-44dc-9f3c-74fab49dd624-kube-api-access-k4vnl\") pod \"dnsmasq-dns-675f4bcbfc-gj8ln\" (UID: \"aed0d482-4eee-44dc-9f3c-74fab49dd624\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.554100 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d23241-cdf8-4417-bb47-d5541e49fb12-config\") pod \"dnsmasq-dns-78dd6ddcc-tkq8w\" (UID: \"69d23241-cdf8-4417-bb47-d5541e49fb12\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:23 crc 
kubenswrapper[4772]: I0320 11:13:23.554154 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69d23241-cdf8-4417-bb47-d5541e49fb12-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tkq8w\" (UID: \"69d23241-cdf8-4417-bb47-d5541e49fb12\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.655566 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69d23241-cdf8-4417-bb47-d5541e49fb12-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tkq8w\" (UID: \"69d23241-cdf8-4417-bb47-d5541e49fb12\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.655650 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed0d482-4eee-44dc-9f3c-74fab49dd624-config\") pod \"dnsmasq-dns-675f4bcbfc-gj8ln\" (UID: \"aed0d482-4eee-44dc-9f3c-74fab49dd624\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.655704 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nm4bd\" (UniqueName: \"kubernetes.io/projected/69d23241-cdf8-4417-bb47-d5541e49fb12-kube-api-access-nm4bd\") pod \"dnsmasq-dns-78dd6ddcc-tkq8w\" (UID: \"69d23241-cdf8-4417-bb47-d5541e49fb12\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.655744 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4vnl\" (UniqueName: \"kubernetes.io/projected/aed0d482-4eee-44dc-9f3c-74fab49dd624-kube-api-access-k4vnl\") pod \"dnsmasq-dns-675f4bcbfc-gj8ln\" (UID: \"aed0d482-4eee-44dc-9f3c-74fab49dd624\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.655774 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d23241-cdf8-4417-bb47-d5541e49fb12-config\") pod \"dnsmasq-dns-78dd6ddcc-tkq8w\" (UID: \"69d23241-cdf8-4417-bb47-d5541e49fb12\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.656874 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/69d23241-cdf8-4417-bb47-d5541e49fb12-config\") pod \"dnsmasq-dns-78dd6ddcc-tkq8w\" (UID: \"69d23241-cdf8-4417-bb47-d5541e49fb12\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.657524 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dns-svc\" (UniqueName: \"kubernetes.io/configmap/69d23241-cdf8-4417-bb47-d5541e49fb12-dns-svc\") pod \"dnsmasq-dns-78dd6ddcc-tkq8w\" (UID: \"69d23241-cdf8-4417-bb47-d5541e49fb12\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.658828 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed0d482-4eee-44dc-9f3c-74fab49dd624-config\") pod \"dnsmasq-dns-675f4bcbfc-gj8ln\" (UID: \"aed0d482-4eee-44dc-9f3c-74fab49dd624\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.679317 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nm4bd\" 
(UniqueName: \"kubernetes.io/projected/69d23241-cdf8-4417-bb47-d5541e49fb12-kube-api-access-nm4bd\") pod \"dnsmasq-dns-78dd6ddcc-tkq8w\" (UID: \"69d23241-cdf8-4417-bb47-d5541e49fb12\") " pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.679375 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4vnl\" (UniqueName: \"kubernetes.io/projected/aed0d482-4eee-44dc-9f3c-74fab49dd624-kube-api-access-k4vnl\") pod \"dnsmasq-dns-675f4bcbfc-gj8ln\" (UID: \"aed0d482-4eee-44dc-9f3c-74fab49dd624\") " pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.753086 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:13:23 crc kubenswrapper[4772]: I0320 11:13:23.802517 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:24 crc kubenswrapper[4772]: I0320 11:13:24.244208 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gj8ln"] Mar 20 11:13:24 crc kubenswrapper[4772]: I0320 11:13:24.320526 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack/dnsmasq-dns-78dd6ddcc-tkq8w"] Mar 20 11:13:24 crc kubenswrapper[4772]: W0320 11:13:24.326023 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod69d23241_cdf8_4417_bb47_d5541e49fb12.slice/crio-7e31eb9d24d612151c9266764a6bfdfc0c897d29058597678c88f13391a0ebb8 WatchSource:0}: Error finding container 7e31eb9d24d612151c9266764a6bfdfc0c897d29058597678c88f13391a0ebb8: Status 404 returned error can't find the container with id 7e31eb9d24d612151c9266764a6bfdfc0c897d29058597678c88f13391a0ebb8 Mar 20 11:13:24 crc kubenswrapper[4772]: I0320 11:13:24.425740 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" event={"ID":"aed0d482-4eee-44dc-9f3c-74fab49dd624","Type":"ContainerStarted","Data":"234d7d2e63e9fbf41aad574d2470a85caa4ba551684ea56bcf73b54bac3c94b0"} Mar 20 11:13:24 crc kubenswrapper[4772]: I0320 11:13:24.426976 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" event={"ID":"69d23241-cdf8-4417-bb47-d5541e49fb12","Type":"ContainerStarted","Data":"7e31eb9d24d612151c9266764a6bfdfc0c897d29058597678c88f13391a0ebb8"} Mar 20 11:13:38 crc kubenswrapper[4772]: E0320 11:13:38.098826 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 11:13:38 crc kubenswrapper[4772]: E0320 11:13:38.099689 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:ndfhb5h667h568h584h5f9h58dh565h664h587h597h577h64bh5c4h66fh647hbdh68ch5c5h68dh686h5f7h64hd7hc6h55fh57bh98h57fh87h5fh57fq,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:dns-svc,ReadOnly:true,MountPath:/etc/dnsmasq.d/hosts/dns-svc,SubPath:dns-svc,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-nm4bd,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-78dd6ddcc-tkq8w_openstack(69d23241-cdf8-4417-bb47-d5541e49fb12): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:13:38 crc kubenswrapper[4772]: E0320 11:13:38.100793 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" podUID="69d23241-cdf8-4417-bb47-d5541e49fb12" Mar 20 11:13:38 crc kubenswrapper[4772]: E0320 11:13:38.106922 4772 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying config: context canceled" image="quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified" Mar 20 11:13:38 crc kubenswrapper[4772]: E0320 11:13:38.107037 4772 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:init,Image:quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified,Command:[/bin/bash],Args:[-c dnsmasq --interface=* --conf-dir=/etc/dnsmasq.d --hostsdir=/etc/dnsmasq.d/hosts --keep-in-foreground --log-debug --bind-interfaces --listen-address=$(POD_IP) --port 5353 --log-facility=- --no-hosts --domain-needed --no-resolv --bogus-priv --log-queries 
--test],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:CONFIG_HASH,Value:nffh5bdhf4h5f8h79h55h77h58fh56dh7bh6fh578hbch55dh68h56bhd9h65dh57ch658hc9h566h666h688h58h65dh684h5d7h6ch575h5d6h88q,ValueFrom:nil,},EnvVar{Name:POD_IP,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:status.podIP,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/etc/dnsmasq.d/config.cfg,SubPath:dns,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-k4vnl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000650000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod dnsmasq-dns-675f4bcbfc-gj8ln_openstack(aed0d482-4eee-44dc-9f3c-74fab49dd624): ErrImagePull: rpc error: code = Canceled desc = copying config: context canceled" logger="UnhandledError" Mar 20 11:13:38 crc kubenswrapper[4772]: E0320 11:13:38.108116 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ErrImagePull: \"rpc error: code = Canceled desc = copying config: context canceled\"" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" podUID="aed0d482-4eee-44dc-9f3c-74fab49dd624" Mar 20 11:13:38 crc kubenswrapper[4772]: E0320 11:13:38.575323 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" podUID="aed0d482-4eee-44dc-9f3c-74fab49dd624" Mar 20 11:13:38 crc kubenswrapper[4772]: E0320 11:13:38.575764 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"init\" with ImagePullBackOff: \"Back-off pulling image \\\"quay.io/podified-antelope-centos9/openstack-neutron-server:current-podified\\\"\"" pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" podUID="69d23241-cdf8-4417-bb47-d5541e49fb12" Mar 20 11:13:50 crc kubenswrapper[4772]: I0320 11:13:50.659885 4772 generic.go:334] "Generic (PLEG): container finished" podID="69d23241-cdf8-4417-bb47-d5541e49fb12" containerID="0d7e5b546039b072922cd6a7380ee8de708f2fdba99a990c81fe9363a062559d" exitCode=0 Mar 20 11:13:50 crc kubenswrapper[4772]: I0320 11:13:50.660613 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" event={"ID":"69d23241-cdf8-4417-bb47-d5541e49fb12","Type":"ContainerDied","Data":"0d7e5b546039b072922cd6a7380ee8de708f2fdba99a990c81fe9363a062559d"} Mar 20 11:13:51 crc kubenswrapper[4772]: I0320 11:13:51.671818 4772 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" event={"ID":"69d23241-cdf8-4417-bb47-d5541e49fb12","Type":"ContainerStarted","Data":"6b30d44d46da15a1dfb2332850d54ab151a34c9460c2ab287a84fec5d0414480"} Mar 20 11:13:51 crc kubenswrapper[4772]: I0320 11:13:51.673059 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:51 crc kubenswrapper[4772]: I0320 11:13:51.692205 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" podStartSLOduration=2.787438261 podStartE2EDuration="28.692186151s" podCreationTimestamp="2026-03-20 11:13:23 +0000 UTC" firstStartedPulling="2026-03-20 11:13:24.328646234 +0000 UTC m=+1090.419612719" lastFinishedPulling="2026-03-20 11:13:50.233394124 +0000 UTC m=+1116.324360609" observedRunningTime="2026-03-20 11:13:51.687130223 +0000 UTC m=+1117.778096728" watchObservedRunningTime="2026-03-20 11:13:51.692186151 +0000 UTC m=+1117.783152636" Mar 20 11:13:53 crc kubenswrapper[4772]: I0320 11:13:53.690049 4772 generic.go:334] "Generic (PLEG): container finished" podID="aed0d482-4eee-44dc-9f3c-74fab49dd624" containerID="355defd92006b8d7c70aaecb98cce7f18a81eb5604f5e98e0608274d237dfc6f" exitCode=0 Mar 20 11:13:53 crc kubenswrapper[4772]: I0320 11:13:53.690141 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" event={"ID":"aed0d482-4eee-44dc-9f3c-74fab49dd624","Type":"ContainerDied","Data":"355defd92006b8d7c70aaecb98cce7f18a81eb5604f5e98e0608274d237dfc6f"} Mar 20 11:13:54 crc kubenswrapper[4772]: I0320 11:13:54.700222 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" event={"ID":"aed0d482-4eee-44dc-9f3c-74fab49dd624","Type":"ContainerStarted","Data":"a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a"} Mar 20 11:13:54 crc kubenswrapper[4772]: I0320 11:13:54.700458 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:13:54 crc kubenswrapper[4772]: I0320 11:13:54.719303 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" podStartSLOduration=-9223372005.13549 podStartE2EDuration="31.719285372s" podCreationTimestamp="2026-03-20 11:13:23 +0000 UTC" firstStartedPulling="2026-03-20 11:13:24.249953057 +0000 UTC m=+1090.340919542" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-03-20 11:13:54.716553857 +0000 UTC m=+1120.807520362" watchObservedRunningTime="2026-03-20 11:13:54.719285372 +0000 UTC m=+1120.810251847" Mar 20 11:13:58 crc kubenswrapper[4772]: I0320 11:13:58.754042 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:13:58 crc kubenswrapper[4772]: I0320 11:13:58.804723 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack/dnsmasq-dns-78dd6ddcc-tkq8w" Mar 20 11:13:58 crc kubenswrapper[4772]: I0320 11:13:58.859387 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gj8ln"] Mar 20 11:13:59 crc kubenswrapper[4772]: I0320 11:13:59.757179 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" podUID="aed0d482-4eee-44dc-9f3c-74fab49dd624" containerName="dnsmasq-dns" containerID="cri-o://a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a" gracePeriod=10 
Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.145366 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566754-rhmjv"] Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.148125 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-rhmjv" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.149897 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.150477 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.150537 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.150688 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.152642 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-rhmjv"] Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.202023 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4vnl\" (UniqueName: \"kubernetes.io/projected/aed0d482-4eee-44dc-9f3c-74fab49dd624-kube-api-access-k4vnl\") pod \"aed0d482-4eee-44dc-9f3c-74fab49dd624\" (UID: \"aed0d482-4eee-44dc-9f3c-74fab49dd624\") " Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.202111 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed0d482-4eee-44dc-9f3c-74fab49dd624-config\") pod \"aed0d482-4eee-44dc-9f3c-74fab49dd624\" (UID: \"aed0d482-4eee-44dc-9f3c-74fab49dd624\") " Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.202574 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27xb7\" (UniqueName: \"kubernetes.io/projected/940f6073-4e13-42da-91e7-bf055f8ecd94-kube-api-access-27xb7\") pod \"auto-csr-approver-29566754-rhmjv\" (UID: \"940f6073-4e13-42da-91e7-bf055f8ecd94\") " pod="openshift-infra/auto-csr-approver-29566754-rhmjv" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.209627 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aed0d482-4eee-44dc-9f3c-74fab49dd624-kube-api-access-k4vnl" (OuterVolumeSpecName: "kube-api-access-k4vnl") pod "aed0d482-4eee-44dc-9f3c-74fab49dd624" (UID: "aed0d482-4eee-44dc-9f3c-74fab49dd624"). InnerVolumeSpecName "kube-api-access-k4vnl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.244549 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/aed0d482-4eee-44dc-9f3c-74fab49dd624-config" (OuterVolumeSpecName: "config") pod "aed0d482-4eee-44dc-9f3c-74fab49dd624" (UID: "aed0d482-4eee-44dc-9f3c-74fab49dd624"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.304629 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27xb7\" (UniqueName: \"kubernetes.io/projected/940f6073-4e13-42da-91e7-bf055f8ecd94-kube-api-access-27xb7\") pod \"auto-csr-approver-29566754-rhmjv\" (UID: \"940f6073-4e13-42da-91e7-bf055f8ecd94\") " pod="openshift-infra/auto-csr-approver-29566754-rhmjv" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.305086 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4vnl\" (UniqueName: \"kubernetes.io/projected/aed0d482-4eee-44dc-9f3c-74fab49dd624-kube-api-access-k4vnl\") on node \"crc\" DevicePath \"\"" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.305178 4772 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/aed0d482-4eee-44dc-9f3c-74fab49dd624-config\") on node \"crc\" DevicePath \"\"" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.321022 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27xb7\" (UniqueName: \"kubernetes.io/projected/940f6073-4e13-42da-91e7-bf055f8ecd94-kube-api-access-27xb7\") pod \"auto-csr-approver-29566754-rhmjv\" (UID: \"940f6073-4e13-42da-91e7-bf055f8ecd94\") " pod="openshift-infra/auto-csr-approver-29566754-rhmjv" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.467814 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-rhmjv" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.765207 4772 generic.go:334] "Generic (PLEG): container finished" podID="aed0d482-4eee-44dc-9f3c-74fab49dd624" containerID="a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a" exitCode=0 Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.765270 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.765309 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" event={"ID":"aed0d482-4eee-44dc-9f3c-74fab49dd624","Type":"ContainerDied","Data":"a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a"} Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.765719 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack/dnsmasq-dns-675f4bcbfc-gj8ln" event={"ID":"aed0d482-4eee-44dc-9f3c-74fab49dd624","Type":"ContainerDied","Data":"234d7d2e63e9fbf41aad574d2470a85caa4ba551684ea56bcf73b54bac3c94b0"} Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.765744 4772 scope.go:117] "RemoveContainer" containerID="a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.787087 4772 scope.go:117] "RemoveContainer" containerID="355defd92006b8d7c70aaecb98cce7f18a81eb5604f5e98e0608274d237dfc6f" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.787103 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gj8ln"] Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.791913 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack/dnsmasq-dns-675f4bcbfc-gj8ln"] Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.801928 4772 scope.go:117] "RemoveContainer" containerID="a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a" Mar 20 11:14:00 crc kubenswrapper[4772]: E0320 11:14:00.802722 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a\": container with ID starting with a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a not found: ID does not exist" containerID="a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.802759 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a"} err="failed to get container status \"a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a\": rpc error: code = NotFound desc = could not find container \"a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a\": container with ID starting with a6725fe84f1669ee4b8610080373cacaea8929d197389213c4115d0aa775f76a not found: ID does not exist" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.802792 4772 scope.go:117] "RemoveContainer" containerID="355defd92006b8d7c70aaecb98cce7f18a81eb5604f5e98e0608274d237dfc6f" Mar 20 11:14:00 crc kubenswrapper[4772]: E0320 11:14:00.803197 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"355defd92006b8d7c70aaecb98cce7f18a81eb5604f5e98e0608274d237dfc6f\": container with ID starting with 355defd92006b8d7c70aaecb98cce7f18a81eb5604f5e98e0608274d237dfc6f not found: ID does not exist" containerID="355defd92006b8d7c70aaecb98cce7f18a81eb5604f5e98e0608274d237dfc6f" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.803220 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"355defd92006b8d7c70aaecb98cce7f18a81eb5604f5e98e0608274d237dfc6f"} err="failed to get container status 
\"355defd92006b8d7c70aaecb98cce7f18a81eb5604f5e98e0608274d237dfc6f\": rpc error: code = NotFound desc = could not find container \"355defd92006b8d7c70aaecb98cce7f18a81eb5604f5e98e0608274d237dfc6f\": container with ID starting with 355defd92006b8d7c70aaecb98cce7f18a81eb5604f5e98e0608274d237dfc6f not found: ID does not exist" Mar 20 11:14:00 crc kubenswrapper[4772]: I0320 11:14:00.860892 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-rhmjv"] Mar 20 11:14:01 crc kubenswrapper[4772]: I0320 11:14:01.774563 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-rhmjv" event={"ID":"940f6073-4e13-42da-91e7-bf055f8ecd94","Type":"ContainerStarted","Data":"b11bd4aae449ddae488eb7cdc78d13ff644ff23ac25ddffd8d85d4572a0e5c49"} Mar 20 11:14:02 crc kubenswrapper[4772]: I0320 11:14:02.651076 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aed0d482-4eee-44dc-9f3c-74fab49dd624" path="/var/lib/kubelet/pods/aed0d482-4eee-44dc-9f3c-74fab49dd624/volumes" Mar 20 11:14:02 crc kubenswrapper[4772]: I0320 11:14:02.784594 4772 generic.go:334] "Generic (PLEG): container finished" podID="940f6073-4e13-42da-91e7-bf055f8ecd94" containerID="36051500080ff917f5010cf28907ec2b464c24ebe5dd94320668dd7b033d1771" exitCode=0 Mar 20 11:14:02 crc kubenswrapper[4772]: I0320 11:14:02.784667 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-rhmjv" event={"ID":"940f6073-4e13-42da-91e7-bf055f8ecd94","Type":"ContainerDied","Data":"36051500080ff917f5010cf28907ec2b464c24ebe5dd94320668dd7b033d1771"} Mar 20 11:14:04 crc kubenswrapper[4772]: I0320 11:14:04.046012 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-rhmjv" Mar 20 11:14:04 crc kubenswrapper[4772]: I0320 11:14:04.165097 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27xb7\" (UniqueName: \"kubernetes.io/projected/940f6073-4e13-42da-91e7-bf055f8ecd94-kube-api-access-27xb7\") pod \"940f6073-4e13-42da-91e7-bf055f8ecd94\" (UID: \"940f6073-4e13-42da-91e7-bf055f8ecd94\") " Mar 20 11:14:04 crc kubenswrapper[4772]: I0320 11:14:04.181326 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/940f6073-4e13-42da-91e7-bf055f8ecd94-kube-api-access-27xb7" (OuterVolumeSpecName: "kube-api-access-27xb7") pod "940f6073-4e13-42da-91e7-bf055f8ecd94" (UID: "940f6073-4e13-42da-91e7-bf055f8ecd94"). InnerVolumeSpecName "kube-api-access-27xb7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:14:04 crc kubenswrapper[4772]: I0320 11:14:04.267586 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27xb7\" (UniqueName: \"kubernetes.io/projected/940f6073-4e13-42da-91e7-bf055f8ecd94-kube-api-access-27xb7\") on node \"crc\" DevicePath \"\"" Mar 20 11:14:04 crc kubenswrapper[4772]: I0320 11:14:04.798541 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566754-rhmjv" event={"ID":"940f6073-4e13-42da-91e7-bf055f8ecd94","Type":"ContainerDied","Data":"b11bd4aae449ddae488eb7cdc78d13ff644ff23ac25ddffd8d85d4572a0e5c49"} Mar 20 11:14:04 crc kubenswrapper[4772]: I0320 11:14:04.798579 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b11bd4aae449ddae488eb7cdc78d13ff644ff23ac25ddffd8d85d4572a0e5c49" Mar 20 11:14:04 crc kubenswrapper[4772]: I0320 11:14:04.798904 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566754-rhmjv" Mar 20 11:14:05 crc kubenswrapper[4772]: I0320 11:14:05.111904 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-dzlb9"] Mar 20 11:14:05 crc kubenswrapper[4772]: I0320 11:14:05.118626 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566748-dzlb9"] Mar 20 11:14:06 crc kubenswrapper[4772]: I0320 11:14:06.650262 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3bb2e471-12ed-447a-87ec-551878c46fea" path="/var/lib/kubelet/pods/3bb2e471-12ed-447a-87ec-551878c46fea/volumes" Mar 20 11:14:38 crc kubenswrapper[4772]: I0320 11:14:38.449819 4772 scope.go:117] "RemoveContainer" containerID="084311825a5bcf13e2fa31bc1f33e2dae25a1b53f78a0f645b4ace1d4f9c250f" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.144002 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr"] Mar 20 11:15:00 crc kubenswrapper[4772]: E0320 11:15:00.144782 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="940f6073-4e13-42da-91e7-bf055f8ecd94" containerName="oc" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.144794 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="940f6073-4e13-42da-91e7-bf055f8ecd94" containerName="oc" Mar 20 11:15:00 crc kubenswrapper[4772]: E0320 11:15:00.144807 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed0d482-4eee-44dc-9f3c-74fab49dd624" containerName="init" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.144814 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed0d482-4eee-44dc-9f3c-74fab49dd624" containerName="init" Mar 20 11:15:00 crc kubenswrapper[4772]: E0320 11:15:00.144863 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aed0d482-4eee-44dc-9f3c-74fab49dd624" containerName="dnsmasq-dns" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.144874 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="aed0d482-4eee-44dc-9f3c-74fab49dd624" containerName="dnsmasq-dns" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.145047 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="940f6073-4e13-42da-91e7-bf055f8ecd94" containerName="oc" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.145064 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="aed0d482-4eee-44dc-9f3c-74fab49dd624" 
containerName="dnsmasq-dns" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.145572 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.147975 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.149473 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.155806 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr"] Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.172786 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-czkt7\" (UniqueName: \"kubernetes.io/projected/906c698f-c498-4f6a-89a2-18bbee45ffa8-kube-api-access-czkt7\") pod \"collect-profiles-29566755-pgssr\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.172831 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/906c698f-c498-4f6a-89a2-18bbee45ffa8-secret-volume\") pod \"collect-profiles-29566755-pgssr\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.172901 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/906c698f-c498-4f6a-89a2-18bbee45ffa8-config-volume\") pod \"collect-profiles-29566755-pgssr\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.274814 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/906c698f-c498-4f6a-89a2-18bbee45ffa8-config-volume\") pod \"collect-profiles-29566755-pgssr\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.274930 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-czkt7\" (UniqueName: \"kubernetes.io/projected/906c698f-c498-4f6a-89a2-18bbee45ffa8-kube-api-access-czkt7\") pod \"collect-profiles-29566755-pgssr\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.274959 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/906c698f-c498-4f6a-89a2-18bbee45ffa8-secret-volume\") pod \"collect-profiles-29566755-pgssr\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.275722 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/906c698f-c498-4f6a-89a2-18bbee45ffa8-config-volume\") pod \"collect-profiles-29566755-pgssr\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.280782 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/906c698f-c498-4f6a-89a2-18bbee45ffa8-secret-volume\") pod \"collect-profiles-29566755-pgssr\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.292801 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-czkt7\" (UniqueName: \"kubernetes.io/projected/906c698f-c498-4f6a-89a2-18bbee45ffa8-kube-api-access-czkt7\") pod \"collect-profiles-29566755-pgssr\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.471572 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:00 crc kubenswrapper[4772]: I0320 11:15:00.886704 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr"] Mar 20 11:15:01 crc kubenswrapper[4772]: I0320 11:15:01.203969 4772 generic.go:334] "Generic (PLEG): container finished" podID="906c698f-c498-4f6a-89a2-18bbee45ffa8" containerID="3c5c01948b16896111842977c03313419d53b8b9752cc6f3a1f986d986c0d152" exitCode=0 Mar 20 11:15:01 crc kubenswrapper[4772]: I0320 11:15:01.204039 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" event={"ID":"906c698f-c498-4f6a-89a2-18bbee45ffa8","Type":"ContainerDied","Data":"3c5c01948b16896111842977c03313419d53b8b9752cc6f3a1f986d986c0d152"} Mar 20 11:15:01 crc kubenswrapper[4772]: I0320 11:15:01.204112 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" event={"ID":"906c698f-c498-4f6a-89a2-18bbee45ffa8","Type":"ContainerStarted","Data":"f3e68f067803a72131c6b8d27c9586c7b91ef4093eeb2f3c6e78f70d6e809bd6"} Mar 20 11:15:02 crc kubenswrapper[4772]: I0320 11:15:02.479358 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:02 crc kubenswrapper[4772]: I0320 11:15:02.509008 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/906c698f-c498-4f6a-89a2-18bbee45ffa8-secret-volume\") pod \"906c698f-c498-4f6a-89a2-18bbee45ffa8\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " Mar 20 11:15:02 crc kubenswrapper[4772]: I0320 11:15:02.509102 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/906c698f-c498-4f6a-89a2-18bbee45ffa8-config-volume\") pod \"906c698f-c498-4f6a-89a2-18bbee45ffa8\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " Mar 20 11:15:02 crc kubenswrapper[4772]: I0320 11:15:02.509137 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-czkt7\" (UniqueName: \"kubernetes.io/projected/906c698f-c498-4f6a-89a2-18bbee45ffa8-kube-api-access-czkt7\") pod \"906c698f-c498-4f6a-89a2-18bbee45ffa8\" (UID: \"906c698f-c498-4f6a-89a2-18bbee45ffa8\") " Mar 20 11:15:02 crc kubenswrapper[4772]: I0320 11:15:02.510256 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/906c698f-c498-4f6a-89a2-18bbee45ffa8-config-volume" (OuterVolumeSpecName: "config-volume") pod "906c698f-c498-4f6a-89a2-18bbee45ffa8" (UID: "906c698f-c498-4f6a-89a2-18bbee45ffa8"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:15:02 crc kubenswrapper[4772]: I0320 11:15:02.515925 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/906c698f-c498-4f6a-89a2-18bbee45ffa8-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "906c698f-c498-4f6a-89a2-18bbee45ffa8" (UID: "906c698f-c498-4f6a-89a2-18bbee45ffa8"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:15:02 crc kubenswrapper[4772]: I0320 11:15:02.516015 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/906c698f-c498-4f6a-89a2-18bbee45ffa8-kube-api-access-czkt7" (OuterVolumeSpecName: "kube-api-access-czkt7") pod "906c698f-c498-4f6a-89a2-18bbee45ffa8" (UID: "906c698f-c498-4f6a-89a2-18bbee45ffa8"). InnerVolumeSpecName "kube-api-access-czkt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:15:02 crc kubenswrapper[4772]: I0320 11:15:02.610739 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/906c698f-c498-4f6a-89a2-18bbee45ffa8-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:02 crc kubenswrapper[4772]: I0320 11:15:02.610795 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/906c698f-c498-4f6a-89a2-18bbee45ffa8-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:02 crc kubenswrapper[4772]: I0320 11:15:02.610811 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-czkt7\" (UniqueName: \"kubernetes.io/projected/906c698f-c498-4f6a-89a2-18bbee45ffa8-kube-api-access-czkt7\") on node \"crc\" DevicePath \"\"" Mar 20 11:15:03 crc kubenswrapper[4772]: I0320 11:15:03.221868 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" event={"ID":"906c698f-c498-4f6a-89a2-18bbee45ffa8","Type":"ContainerDied","Data":"f3e68f067803a72131c6b8d27c9586c7b91ef4093eeb2f3c6e78f70d6e809bd6"} Mar 20 11:15:03 crc kubenswrapper[4772]: I0320 11:15:03.222219 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f3e68f067803a72131c6b8d27c9586c7b91ef4093eeb2f3c6e78f70d6e809bd6" Mar 20 11:15:03 crc kubenswrapper[4772]: I0320 11:15:03.221927 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566755-pgssr" Mar 20 11:15:09 crc kubenswrapper[4772]: I0320 11:15:09.564318 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:15:09 crc kubenswrapper[4772]: I0320 11:15:09.565232 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:15:39 crc kubenswrapper[4772]: I0320 11:15:39.564324 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:15:39 crc kubenswrapper[4772]: I0320 11:15:39.565244 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.149375 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566756-qwnl7"] Mar 20 11:16:00 crc kubenswrapper[4772]: E0320 11:16:00.150974 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="906c698f-c498-4f6a-89a2-18bbee45ffa8" containerName="collect-profiles" Mar 20 11:16:00 crc 
kubenswrapper[4772]: I0320 11:16:00.150998 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="906c698f-c498-4f6a-89a2-18bbee45ffa8" containerName="collect-profiles" Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.151210 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="906c698f-c498-4f6a-89a2-18bbee45ffa8" containerName="collect-profiles" Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.151966 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-qwnl7" Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.157778 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-qwnl7"] Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.211548 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.211913 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.212526 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.313408 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk4sl\" (UniqueName: \"kubernetes.io/projected/c900acae-a1f8-4ba6-9288-b7ef705e0c9e-kube-api-access-rk4sl\") pod \"auto-csr-approver-29566756-qwnl7\" (UID: \"c900acae-a1f8-4ba6-9288-b7ef705e0c9e\") " pod="openshift-infra/auto-csr-approver-29566756-qwnl7" Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.415636 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk4sl\" (UniqueName: \"kubernetes.io/projected/c900acae-a1f8-4ba6-9288-b7ef705e0c9e-kube-api-access-rk4sl\") pod \"auto-csr-approver-29566756-qwnl7\" (UID: \"c900acae-a1f8-4ba6-9288-b7ef705e0c9e\") " pod="openshift-infra/auto-csr-approver-29566756-qwnl7" Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.444227 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk4sl\" (UniqueName: \"kubernetes.io/projected/c900acae-a1f8-4ba6-9288-b7ef705e0c9e-kube-api-access-rk4sl\") pod \"auto-csr-approver-29566756-qwnl7\" (UID: \"c900acae-a1f8-4ba6-9288-b7ef705e0c9e\") " pod="openshift-infra/auto-csr-approver-29566756-qwnl7" Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.535460 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-qwnl7" Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.945280 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-qwnl7"] Mar 20 11:16:00 crc kubenswrapper[4772]: I0320 11:16:00.953806 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:16:01 crc kubenswrapper[4772]: I0320 11:16:01.638056 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-qwnl7" event={"ID":"c900acae-a1f8-4ba6-9288-b7ef705e0c9e","Type":"ContainerStarted","Data":"82920c99cd61302b5009f2afb5554c193ac7389cb4e0dab9d56920f1694f0040"} Mar 20 11:16:02 crc kubenswrapper[4772]: I0320 11:16:02.649663 4772 generic.go:334] "Generic (PLEG): container finished" podID="c900acae-a1f8-4ba6-9288-b7ef705e0c9e" containerID="8f25f50b590f9c8c97f592b36feeff33abf1be5c31dfc659dd5bf5f5d4ec2202" exitCode=0 Mar 20 11:16:02 crc kubenswrapper[4772]: I0320 11:16:02.654473 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-qwnl7" event={"ID":"c900acae-a1f8-4ba6-9288-b7ef705e0c9e","Type":"ContainerDied","Data":"8f25f50b590f9c8c97f592b36feeff33abf1be5c31dfc659dd5bf5f5d4ec2202"} Mar 20 11:16:03 crc kubenswrapper[4772]: I0320 11:16:03.930624 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-qwnl7" Mar 20 11:16:04 crc kubenswrapper[4772]: I0320 11:16:04.076622 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk4sl\" (UniqueName: \"kubernetes.io/projected/c900acae-a1f8-4ba6-9288-b7ef705e0c9e-kube-api-access-rk4sl\") pod \"c900acae-a1f8-4ba6-9288-b7ef705e0c9e\" (UID: \"c900acae-a1f8-4ba6-9288-b7ef705e0c9e\") " Mar 20 11:16:04 crc kubenswrapper[4772]: I0320 11:16:04.082665 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c900acae-a1f8-4ba6-9288-b7ef705e0c9e-kube-api-access-rk4sl" (OuterVolumeSpecName: "kube-api-access-rk4sl") pod "c900acae-a1f8-4ba6-9288-b7ef705e0c9e" (UID: "c900acae-a1f8-4ba6-9288-b7ef705e0c9e"). InnerVolumeSpecName "kube-api-access-rk4sl". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:16:04 crc kubenswrapper[4772]: I0320 11:16:04.178501 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk4sl\" (UniqueName: \"kubernetes.io/projected/c900acae-a1f8-4ba6-9288-b7ef705e0c9e-kube-api-access-rk4sl\") on node \"crc\" DevicePath \"\"" Mar 20 11:16:04 crc kubenswrapper[4772]: I0320 11:16:04.668505 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566756-qwnl7" event={"ID":"c900acae-a1f8-4ba6-9288-b7ef705e0c9e","Type":"ContainerDied","Data":"82920c99cd61302b5009f2afb5554c193ac7389cb4e0dab9d56920f1694f0040"} Mar 20 11:16:04 crc kubenswrapper[4772]: I0320 11:16:04.668554 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566756-qwnl7" Mar 20 11:16:04 crc kubenswrapper[4772]: I0320 11:16:04.668560 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82920c99cd61302b5009f2afb5554c193ac7389cb4e0dab9d56920f1694f0040" Mar 20 11:16:04 crc kubenswrapper[4772]: I0320 11:16:04.997092 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-vw5dq"] Mar 20 11:16:05 crc kubenswrapper[4772]: I0320 11:16:05.002885 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566750-vw5dq"] Mar 20 11:16:06 crc kubenswrapper[4772]: I0320 11:16:06.653125 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8405588b-3a72-4eb6-8223-277ba942d2b1" path="/var/lib/kubelet/pods/8405588b-3a72-4eb6-8223-277ba942d2b1/volumes" Mar 20 11:16:09 crc kubenswrapper[4772]: I0320 11:16:09.564727 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:16:09 crc kubenswrapper[4772]: I0320 11:16:09.565282 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:16:09 crc kubenswrapper[4772]: I0320 11:16:09.565362 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 11:16:09 crc kubenswrapper[4772]: I0320 11:16:09.566316 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d53ebb3bfe8693516c060197c431334b6d14ac36c11fb07c44a8694176495d3d"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:16:09 crc kubenswrapper[4772]: I0320 11:16:09.566397 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://d53ebb3bfe8693516c060197c431334b6d14ac36c11fb07c44a8694176495d3d" gracePeriod=600 Mar 20 11:16:09 crc kubenswrapper[4772]: I0320 11:16:09.707555 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="d53ebb3bfe8693516c060197c431334b6d14ac36c11fb07c44a8694176495d3d" exitCode=0 Mar 20 11:16:09 crc kubenswrapper[4772]: I0320 11:16:09.707632 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"d53ebb3bfe8693516c060197c431334b6d14ac36c11fb07c44a8694176495d3d"} Mar 20 11:16:09 crc kubenswrapper[4772]: I0320 11:16:09.707692 4772 scope.go:117] "RemoveContainer" containerID="3a2dc425dd346ae424a2a128cb64ede7d6abbbfbc7a26799f2508db56e373109" Mar 20 11:16:10 crc kubenswrapper[4772]: I0320 11:16:10.716858 4772 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"7c8714aee25d85b48463c81e409c35750bf47c0d95326d0c06a87712c350ffeb"} Mar 20 11:16:38 crc kubenswrapper[4772]: I0320 11:16:38.537409 4772 scope.go:117] "RemoveContainer" containerID="3d1ab177438067585a87dc1f86da415365ecd5a7c064ac8d0005e88c9d3dd16b" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.143913 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566758-z45gj"] Mar 20 11:18:00 crc kubenswrapper[4772]: E0320 11:18:00.144743 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c900acae-a1f8-4ba6-9288-b7ef705e0c9e" containerName="oc" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.144758 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c900acae-a1f8-4ba6-9288-b7ef705e0c9e" containerName="oc" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.144960 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c900acae-a1f8-4ba6-9288-b7ef705e0c9e" containerName="oc" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.145428 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-z45gj" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.147129 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.147452 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.147656 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.159212 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-z45gj"] Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.287182 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkjr7\" (UniqueName: \"kubernetes.io/projected/7a4736c7-2076-484e-9338-398a38fa7e0e-kube-api-access-fkjr7\") pod \"auto-csr-approver-29566758-z45gj\" (UID: \"7a4736c7-2076-484e-9338-398a38fa7e0e\") " pod="openshift-infra/auto-csr-approver-29566758-z45gj" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.389137 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkjr7\" (UniqueName: \"kubernetes.io/projected/7a4736c7-2076-484e-9338-398a38fa7e0e-kube-api-access-fkjr7\") pod \"auto-csr-approver-29566758-z45gj\" (UID: \"7a4736c7-2076-484e-9338-398a38fa7e0e\") " pod="openshift-infra/auto-csr-approver-29566758-z45gj" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.417022 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkjr7\" (UniqueName: \"kubernetes.io/projected/7a4736c7-2076-484e-9338-398a38fa7e0e-kube-api-access-fkjr7\") pod \"auto-csr-approver-29566758-z45gj\" (UID: \"7a4736c7-2076-484e-9338-398a38fa7e0e\") " pod="openshift-infra/auto-csr-approver-29566758-z45gj" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.469718 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-z45gj" Mar 20 11:18:00 crc kubenswrapper[4772]: I0320 11:18:00.905025 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-z45gj"] Mar 20 11:18:01 crc kubenswrapper[4772]: I0320 11:18:01.528751 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-z45gj" event={"ID":"7a4736c7-2076-484e-9338-398a38fa7e0e","Type":"ContainerStarted","Data":"81812eef6d879972b0cdf52bc92c0cdf09172a05f638e5bad5860d4bea06ecb2"} Mar 20 11:18:02 crc kubenswrapper[4772]: I0320 11:18:02.537176 4772 generic.go:334] "Generic (PLEG): container finished" podID="7a4736c7-2076-484e-9338-398a38fa7e0e" containerID="19e24cb9603d3448be2bcd7eada252e0b28b6d58979e4adbfda02e3bf19fc74b" exitCode=0 Mar 20 11:18:02 crc kubenswrapper[4772]: I0320 11:18:02.537246 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-z45gj" event={"ID":"7a4736c7-2076-484e-9338-398a38fa7e0e","Type":"ContainerDied","Data":"19e24cb9603d3448be2bcd7eada252e0b28b6d58979e4adbfda02e3bf19fc74b"} Mar 20 11:18:03 crc kubenswrapper[4772]: I0320 11:18:03.815530 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-z45gj" Mar 20 11:18:03 crc kubenswrapper[4772]: I0320 11:18:03.939995 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkjr7\" (UniqueName: \"kubernetes.io/projected/7a4736c7-2076-484e-9338-398a38fa7e0e-kube-api-access-fkjr7\") pod \"7a4736c7-2076-484e-9338-398a38fa7e0e\" (UID: \"7a4736c7-2076-484e-9338-398a38fa7e0e\") " Mar 20 11:18:03 crc kubenswrapper[4772]: I0320 11:18:03.947444 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7a4736c7-2076-484e-9338-398a38fa7e0e-kube-api-access-fkjr7" (OuterVolumeSpecName: "kube-api-access-fkjr7") pod "7a4736c7-2076-484e-9338-398a38fa7e0e" (UID: "7a4736c7-2076-484e-9338-398a38fa7e0e"). InnerVolumeSpecName "kube-api-access-fkjr7". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:18:04 crc kubenswrapper[4772]: I0320 11:18:04.041775 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkjr7\" (UniqueName: \"kubernetes.io/projected/7a4736c7-2076-484e-9338-398a38fa7e0e-kube-api-access-fkjr7\") on node \"crc\" DevicePath \"\"" Mar 20 11:18:04 crc kubenswrapper[4772]: I0320 11:18:04.551897 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566758-z45gj" event={"ID":"7a4736c7-2076-484e-9338-398a38fa7e0e","Type":"ContainerDied","Data":"81812eef6d879972b0cdf52bc92c0cdf09172a05f638e5bad5860d4bea06ecb2"} Mar 20 11:18:04 crc kubenswrapper[4772]: I0320 11:18:04.552167 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="81812eef6d879972b0cdf52bc92c0cdf09172a05f638e5bad5860d4bea06ecb2" Mar 20 11:18:04 crc kubenswrapper[4772]: I0320 11:18:04.552226 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566758-z45gj" Mar 20 11:18:04 crc kubenswrapper[4772]: I0320 11:18:04.893643 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-85knh"] Mar 20 11:18:04 crc kubenswrapper[4772]: I0320 11:18:04.900396 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566752-85knh"] Mar 20 11:18:06 crc kubenswrapper[4772]: I0320 11:18:06.650315 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="51958443-f266-4496-9825-88985bff7a41" path="/var/lib/kubelet/pods/51958443-f266-4496-9825-88985bff7a41/volumes" Mar 20 11:18:09 crc kubenswrapper[4772]: I0320 11:18:09.564698 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:18:09 crc kubenswrapper[4772]: I0320 11:18:09.565170 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:18:38 crc kubenswrapper[4772]: I0320 11:18:38.607372 4772 scope.go:117] "RemoveContainer" containerID="24b9aeccfcd22f0a9771929ce620113e303a493dae8973c020565b63c00aa39f" Mar 20 11:18:39 crc kubenswrapper[4772]: I0320 11:18:39.565175 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:18:39 crc kubenswrapper[4772]: I0320 11:18:39.565558 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.156464 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-vpf7g"] Mar 20 11:18:56 crc kubenswrapper[4772]: E0320 11:18:56.157405 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7a4736c7-2076-484e-9338-398a38fa7e0e" containerName="oc" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.157422 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7a4736c7-2076-484e-9338-398a38fa7e0e" containerName="oc" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.157576 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7a4736c7-2076-484e-9338-398a38fa7e0e" containerName="oc" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.158455 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.172467 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpf7g"] Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.219834 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-utilities\") pod \"redhat-operators-vpf7g\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.219904 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-catalog-content\") pod \"redhat-operators-vpf7g\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.220044 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9npk\" (UniqueName: \"kubernetes.io/projected/716c8794-f08b-4c35-9186-8dd4e5413f28-kube-api-access-j9npk\") pod \"redhat-operators-vpf7g\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.322009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9npk\" (UniqueName: \"kubernetes.io/projected/716c8794-f08b-4c35-9186-8dd4e5413f28-kube-api-access-j9npk\") pod \"redhat-operators-vpf7g\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.322351 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-utilities\") pod \"redhat-operators-vpf7g\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.322452 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-catalog-content\") pod \"redhat-operators-vpf7g\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.322896 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-utilities\") pod \"redhat-operators-vpf7g\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.322925 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-catalog-content\") pod \"redhat-operators-vpf7g\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.350119 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-j9npk\" (UniqueName: \"kubernetes.io/projected/716c8794-f08b-4c35-9186-8dd4e5413f28-kube-api-access-j9npk\") pod \"redhat-operators-vpf7g\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.478306 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:18:56 crc kubenswrapper[4772]: I0320 11:18:56.922084 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-vpf7g"] Mar 20 11:18:57 crc kubenswrapper[4772]: I0320 11:18:57.921266 4772 generic.go:334] "Generic (PLEG): container finished" podID="716c8794-f08b-4c35-9186-8dd4e5413f28" containerID="6ba0bce48ae33848fa12336c3e0afd2a44dcdaa490904c6d5816cb331d5904aa" exitCode=0 Mar 20 11:18:57 crc kubenswrapper[4772]: I0320 11:18:57.921401 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpf7g" event={"ID":"716c8794-f08b-4c35-9186-8dd4e5413f28","Type":"ContainerDied","Data":"6ba0bce48ae33848fa12336c3e0afd2a44dcdaa490904c6d5816cb331d5904aa"} Mar 20 11:18:57 crc kubenswrapper[4772]: I0320 11:18:57.921514 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpf7g" event={"ID":"716c8794-f08b-4c35-9186-8dd4e5413f28","Type":"ContainerStarted","Data":"f17403e25778ac4e4b492d3beaef289c729205eba816b9418a7eb9bd2b4a3765"} Mar 20 11:18:59 crc kubenswrapper[4772]: I0320 11:18:59.935175 4772 generic.go:334] "Generic (PLEG): container finished" podID="716c8794-f08b-4c35-9186-8dd4e5413f28" containerID="bbe36fa50fa0334820bd6380cba065d94990db93e39b7a0856104d2b346f61f2" exitCode=0 Mar 20 11:18:59 crc kubenswrapper[4772]: I0320 11:18:59.935270 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpf7g" event={"ID":"716c8794-f08b-4c35-9186-8dd4e5413f28","Type":"ContainerDied","Data":"bbe36fa50fa0334820bd6380cba065d94990db93e39b7a0856104d2b346f61f2"} Mar 20 11:19:00 crc kubenswrapper[4772]: I0320 11:19:00.947198 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpf7g" event={"ID":"716c8794-f08b-4c35-9186-8dd4e5413f28","Type":"ContainerStarted","Data":"c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93"} Mar 20 11:19:00 crc kubenswrapper[4772]: I0320 11:19:00.965926 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-vpf7g" podStartSLOduration=2.4949589899999998 podStartE2EDuration="4.96590487s" podCreationTimestamp="2026-03-20 11:18:56 +0000 UTC" firstStartedPulling="2026-03-20 11:18:57.923073672 +0000 UTC m=+1424.014040157" lastFinishedPulling="2026-03-20 11:19:00.394019552 +0000 UTC m=+1426.484986037" observedRunningTime="2026-03-20 11:19:00.96444984 +0000 UTC m=+1427.055416325" watchObservedRunningTime="2026-03-20 11:19:00.96590487 +0000 UTC m=+1427.056871355" Mar 20 11:19:06 crc kubenswrapper[4772]: I0320 11:19:06.478418 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:19:06 crc kubenswrapper[4772]: I0320 11:19:06.478926 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:19:06 crc kubenswrapper[4772]: I0320 11:19:06.516724 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:19:07 crc kubenswrapper[4772]: I0320 11:19:07.016939 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:19:07 crc kubenswrapper[4772]: I0320 11:19:07.059131 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpf7g"] Mar 20 11:19:08 crc kubenswrapper[4772]: I0320 11:19:08.996296 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-vpf7g" podUID="716c8794-f08b-4c35-9186-8dd4e5413f28" containerName="registry-server" containerID="cri-o://c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93" gracePeriod=2 Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.362912 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.414613 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-catalog-content\") pod \"716c8794-f08b-4c35-9186-8dd4e5413f28\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.414666 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-utilities\") pod \"716c8794-f08b-4c35-9186-8dd4e5413f28\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.414724 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9npk\" (UniqueName: \"kubernetes.io/projected/716c8794-f08b-4c35-9186-8dd4e5413f28-kube-api-access-j9npk\") pod \"716c8794-f08b-4c35-9186-8dd4e5413f28\" (UID: \"716c8794-f08b-4c35-9186-8dd4e5413f28\") " Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.415944 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-utilities" (OuterVolumeSpecName: "utilities") pod "716c8794-f08b-4c35-9186-8dd4e5413f28" (UID: "716c8794-f08b-4c35-9186-8dd4e5413f28"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.422027 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/716c8794-f08b-4c35-9186-8dd4e5413f28-kube-api-access-j9npk" (OuterVolumeSpecName: "kube-api-access-j9npk") pod "716c8794-f08b-4c35-9186-8dd4e5413f28" (UID: "716c8794-f08b-4c35-9186-8dd4e5413f28"). InnerVolumeSpecName "kube-api-access-j9npk". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.516179 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.516218 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9npk\" (UniqueName: \"kubernetes.io/projected/716c8794-f08b-4c35-9186-8dd4e5413f28-kube-api-access-j9npk\") on node \"crc\" DevicePath \"\"" Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.564234 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.564278 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.564324 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.564504 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "716c8794-f08b-4c35-9186-8dd4e5413f28" (UID: "716c8794-f08b-4c35-9186-8dd4e5413f28"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.565081 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"7c8714aee25d85b48463c81e409c35750bf47c0d95326d0c06a87712c350ffeb"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.565149 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://7c8714aee25d85b48463c81e409c35750bf47c0d95326d0c06a87712c350ffeb" gracePeriod=600 Mar 20 11:19:09 crc kubenswrapper[4772]: I0320 11:19:09.618786 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/716c8794-f08b-4c35-9186-8dd4e5413f28-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.005483 4772 generic.go:334] "Generic (PLEG): container finished" podID="716c8794-f08b-4c35-9186-8dd4e5413f28" containerID="c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93" exitCode=0 Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.005518 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-vpf7g" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.005548 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpf7g" event={"ID":"716c8794-f08b-4c35-9186-8dd4e5413f28","Type":"ContainerDied","Data":"c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93"} Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.005594 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-vpf7g" event={"ID":"716c8794-f08b-4c35-9186-8dd4e5413f28","Type":"ContainerDied","Data":"f17403e25778ac4e4b492d3beaef289c729205eba816b9418a7eb9bd2b4a3765"} Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.005617 4772 scope.go:117] "RemoveContainer" containerID="c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.013381 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="7c8714aee25d85b48463c81e409c35750bf47c0d95326d0c06a87712c350ffeb" exitCode=0 Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.013419 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"7c8714aee25d85b48463c81e409c35750bf47c0d95326d0c06a87712c350ffeb"} Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.013470 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a"} Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.032488 4772 scope.go:117] "RemoveContainer" containerID="bbe36fa50fa0334820bd6380cba065d94990db93e39b7a0856104d2b346f61f2" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.052667 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-vpf7g"] Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.058579 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-vpf7g"] Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.062819 4772 scope.go:117] "RemoveContainer" containerID="6ba0bce48ae33848fa12336c3e0afd2a44dcdaa490904c6d5816cb331d5904aa" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.078496 4772 scope.go:117] "RemoveContainer" containerID="c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93" Mar 20 11:19:10 crc kubenswrapper[4772]: E0320 11:19:10.088333 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93\": container with ID starting with c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93 not found: ID does not exist" containerID="c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.088378 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93"} err="failed to get container status \"c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93\": rpc error: code = NotFound desc = could not find 
container \"c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93\": container with ID starting with c015bb5c5da41bac32305d573d387392bafbf200478d219efbae08161ed3ef93 not found: ID does not exist" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.088406 4772 scope.go:117] "RemoveContainer" containerID="bbe36fa50fa0334820bd6380cba065d94990db93e39b7a0856104d2b346f61f2" Mar 20 11:19:10 crc kubenswrapper[4772]: E0320 11:19:10.088972 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bbe36fa50fa0334820bd6380cba065d94990db93e39b7a0856104d2b346f61f2\": container with ID starting with bbe36fa50fa0334820bd6380cba065d94990db93e39b7a0856104d2b346f61f2 not found: ID does not exist" containerID="bbe36fa50fa0334820bd6380cba065d94990db93e39b7a0856104d2b346f61f2" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.088989 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bbe36fa50fa0334820bd6380cba065d94990db93e39b7a0856104d2b346f61f2"} err="failed to get container status \"bbe36fa50fa0334820bd6380cba065d94990db93e39b7a0856104d2b346f61f2\": rpc error: code = NotFound desc = could not find container \"bbe36fa50fa0334820bd6380cba065d94990db93e39b7a0856104d2b346f61f2\": container with ID starting with bbe36fa50fa0334820bd6380cba065d94990db93e39b7a0856104d2b346f61f2 not found: ID does not exist" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.089001 4772 scope.go:117] "RemoveContainer" containerID="6ba0bce48ae33848fa12336c3e0afd2a44dcdaa490904c6d5816cb331d5904aa" Mar 20 11:19:10 crc kubenswrapper[4772]: E0320 11:19:10.089545 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ba0bce48ae33848fa12336c3e0afd2a44dcdaa490904c6d5816cb331d5904aa\": container with ID starting with 6ba0bce48ae33848fa12336c3e0afd2a44dcdaa490904c6d5816cb331d5904aa not found: ID does not exist" containerID="6ba0bce48ae33848fa12336c3e0afd2a44dcdaa490904c6d5816cb331d5904aa" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.089576 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ba0bce48ae33848fa12336c3e0afd2a44dcdaa490904c6d5816cb331d5904aa"} err="failed to get container status \"6ba0bce48ae33848fa12336c3e0afd2a44dcdaa490904c6d5816cb331d5904aa\": rpc error: code = NotFound desc = could not find container \"6ba0bce48ae33848fa12336c3e0afd2a44dcdaa490904c6d5816cb331d5904aa\": container with ID starting with 6ba0bce48ae33848fa12336c3e0afd2a44dcdaa490904c6d5816cb331d5904aa not found: ID does not exist" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.089592 4772 scope.go:117] "RemoveContainer" containerID="d53ebb3bfe8693516c060197c431334b6d14ac36c11fb07c44a8694176495d3d" Mar 20 11:19:10 crc kubenswrapper[4772]: I0320 11:19:10.652371 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="716c8794-f08b-4c35-9186-8dd4e5413f28" path="/var/lib/kubelet/pods/716c8794-f08b-4c35-9186-8dd4e5413f28/volumes" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.139186 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566760-bmqrx"] Mar 20 11:20:00 crc kubenswrapper[4772]: E0320 11:20:00.140082 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716c8794-f08b-4c35-9186-8dd4e5413f28" containerName="extract-content" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.140096 4772 
state_mem.go:107] "Deleted CPUSet assignment" podUID="716c8794-f08b-4c35-9186-8dd4e5413f28" containerName="extract-content" Mar 20 11:20:00 crc kubenswrapper[4772]: E0320 11:20:00.140112 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716c8794-f08b-4c35-9186-8dd4e5413f28" containerName="extract-utilities" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.140118 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="716c8794-f08b-4c35-9186-8dd4e5413f28" containerName="extract-utilities" Mar 20 11:20:00 crc kubenswrapper[4772]: E0320 11:20:00.140136 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="716c8794-f08b-4c35-9186-8dd4e5413f28" containerName="registry-server" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.140145 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="716c8794-f08b-4c35-9186-8dd4e5413f28" containerName="registry-server" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.140289 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="716c8794-f08b-4c35-9186-8dd4e5413f28" containerName="registry-server" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.140908 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-bmqrx" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.142665 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.142965 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.143966 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.147591 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-bmqrx"] Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.249514 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsfd5\" (UniqueName: \"kubernetes.io/projected/ac923150-8ab0-4dd2-9fe7-9007302adef2-kube-api-access-dsfd5\") pod \"auto-csr-approver-29566760-bmqrx\" (UID: \"ac923150-8ab0-4dd2-9fe7-9007302adef2\") " pod="openshift-infra/auto-csr-approver-29566760-bmqrx" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.350344 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dsfd5\" (UniqueName: \"kubernetes.io/projected/ac923150-8ab0-4dd2-9fe7-9007302adef2-kube-api-access-dsfd5\") pod \"auto-csr-approver-29566760-bmqrx\" (UID: \"ac923150-8ab0-4dd2-9fe7-9007302adef2\") " pod="openshift-infra/auto-csr-approver-29566760-bmqrx" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.370026 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsfd5\" (UniqueName: \"kubernetes.io/projected/ac923150-8ab0-4dd2-9fe7-9007302adef2-kube-api-access-dsfd5\") pod \"auto-csr-approver-29566760-bmqrx\" (UID: \"ac923150-8ab0-4dd2-9fe7-9007302adef2\") " pod="openshift-infra/auto-csr-approver-29566760-bmqrx" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.459922 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-bmqrx" Mar 20 11:20:00 crc kubenswrapper[4772]: I0320 11:20:00.869904 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-bmqrx"] Mar 20 11:20:01 crc kubenswrapper[4772]: I0320 11:20:01.357374 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-bmqrx" event={"ID":"ac923150-8ab0-4dd2-9fe7-9007302adef2","Type":"ContainerStarted","Data":"ddade1b5be867cae20289fb8603aed4db5fab3e9822ceea72063dceab1be4982"} Mar 20 11:20:03 crc kubenswrapper[4772]: I0320 11:20:03.372823 4772 generic.go:334] "Generic (PLEG): container finished" podID="ac923150-8ab0-4dd2-9fe7-9007302adef2" containerID="c7a2244d64411c0cbefe4eaf6f0f813768f94fa5d9dd79c8b065c437666a6cc7" exitCode=0 Mar 20 11:20:03 crc kubenswrapper[4772]: I0320 11:20:03.372930 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-bmqrx" event={"ID":"ac923150-8ab0-4dd2-9fe7-9007302adef2","Type":"ContainerDied","Data":"c7a2244d64411c0cbefe4eaf6f0f813768f94fa5d9dd79c8b065c437666a6cc7"} Mar 20 11:20:04 crc kubenswrapper[4772]: I0320 11:20:04.663033 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-bmqrx" Mar 20 11:20:04 crc kubenswrapper[4772]: I0320 11:20:04.715655 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dsfd5\" (UniqueName: \"kubernetes.io/projected/ac923150-8ab0-4dd2-9fe7-9007302adef2-kube-api-access-dsfd5\") pod \"ac923150-8ab0-4dd2-9fe7-9007302adef2\" (UID: \"ac923150-8ab0-4dd2-9fe7-9007302adef2\") " Mar 20 11:20:04 crc kubenswrapper[4772]: I0320 11:20:04.721714 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac923150-8ab0-4dd2-9fe7-9007302adef2-kube-api-access-dsfd5" (OuterVolumeSpecName: "kube-api-access-dsfd5") pod "ac923150-8ab0-4dd2-9fe7-9007302adef2" (UID: "ac923150-8ab0-4dd2-9fe7-9007302adef2"). InnerVolumeSpecName "kube-api-access-dsfd5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:20:04 crc kubenswrapper[4772]: I0320 11:20:04.817827 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dsfd5\" (UniqueName: \"kubernetes.io/projected/ac923150-8ab0-4dd2-9fe7-9007302adef2-kube-api-access-dsfd5\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:05 crc kubenswrapper[4772]: I0320 11:20:05.389799 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566760-bmqrx" event={"ID":"ac923150-8ab0-4dd2-9fe7-9007302adef2","Type":"ContainerDied","Data":"ddade1b5be867cae20289fb8603aed4db5fab3e9822ceea72063dceab1be4982"} Mar 20 11:20:05 crc kubenswrapper[4772]: I0320 11:20:05.390287 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ddade1b5be867cae20289fb8603aed4db5fab3e9822ceea72063dceab1be4982" Mar 20 11:20:05 crc kubenswrapper[4772]: I0320 11:20:05.389910 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566760-bmqrx" Mar 20 11:20:05 crc kubenswrapper[4772]: I0320 11:20:05.737032 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-rhmjv"] Mar 20 11:20:05 crc kubenswrapper[4772]: I0320 11:20:05.743041 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566754-rhmjv"] Mar 20 11:20:06 crc kubenswrapper[4772]: I0320 11:20:06.652910 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="940f6073-4e13-42da-91e7-bf055f8ecd94" path="/var/lib/kubelet/pods/940f6073-4e13-42da-91e7-bf055f8ecd94/volumes" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.422763 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-whj9r"] Mar 20 11:20:19 crc kubenswrapper[4772]: E0320 11:20:19.424125 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac923150-8ab0-4dd2-9fe7-9007302adef2" containerName="oc" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.424144 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac923150-8ab0-4dd2-9fe7-9007302adef2" containerName="oc" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.424436 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac923150-8ab0-4dd2-9fe7-9007302adef2" containerName="oc" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.425877 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.433315 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whj9r"] Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.561391 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-utilities\") pod \"certified-operators-whj9r\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.561474 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-catalog-content\") pod \"certified-operators-whj9r\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.561547 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xtcwb\" (UniqueName: \"kubernetes.io/projected/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-kube-api-access-xtcwb\") pod \"certified-operators-whj9r\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.662915 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-utilities\") pod \"certified-operators-whj9r\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.662998 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-catalog-content\") pod \"certified-operators-whj9r\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.663048 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xtcwb\" (UniqueName: \"kubernetes.io/projected/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-kube-api-access-xtcwb\") pod \"certified-operators-whj9r\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.663545 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-utilities\") pod \"certified-operators-whj9r\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.664269 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-catalog-content\") pod \"certified-operators-whj9r\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.686550 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xtcwb\" (UniqueName: \"kubernetes.io/projected/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-kube-api-access-xtcwb\") pod \"certified-operators-whj9r\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:19 crc kubenswrapper[4772]: I0320 11:20:19.749534 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:20 crc kubenswrapper[4772]: I0320 11:20:20.243290 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-whj9r"] Mar 20 11:20:20 crc kubenswrapper[4772]: I0320 11:20:20.501450 4772 generic.go:334] "Generic (PLEG): container finished" podID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" containerID="9c7fddaa422ce5563d8a3160794fb3fc1ca18d705dc2aa7a37c819d22740e7bf" exitCode=0 Mar 20 11:20:20 crc kubenswrapper[4772]: I0320 11:20:20.501516 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whj9r" event={"ID":"b6c05da9-6e9c-49a7-af20-9a4f8eef1852","Type":"ContainerDied","Data":"9c7fddaa422ce5563d8a3160794fb3fc1ca18d705dc2aa7a37c819d22740e7bf"} Mar 20 11:20:20 crc kubenswrapper[4772]: I0320 11:20:20.501773 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whj9r" event={"ID":"b6c05da9-6e9c-49a7-af20-9a4f8eef1852","Type":"ContainerStarted","Data":"31d357366d387c475c48a42589b6d6970f47f42edb90487f293db0d5b07b7f0e"} Mar 20 11:20:22 crc kubenswrapper[4772]: I0320 11:20:22.517798 4772 generic.go:334] "Generic (PLEG): container finished" podID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" containerID="d8a8343f137593a78c9b3b44b02b0836e7b537184243501228a9c40ba73b1bb1" exitCode=0 Mar 20 11:20:22 crc kubenswrapper[4772]: I0320 11:20:22.517883 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whj9r" event={"ID":"b6c05da9-6e9c-49a7-af20-9a4f8eef1852","Type":"ContainerDied","Data":"d8a8343f137593a78c9b3b44b02b0836e7b537184243501228a9c40ba73b1bb1"} Mar 20 11:20:24 crc kubenswrapper[4772]: I0320 11:20:24.538245 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whj9r" event={"ID":"b6c05da9-6e9c-49a7-af20-9a4f8eef1852","Type":"ContainerStarted","Data":"ece08f754290c5b936d56cebffd43ff627a9641d4be976b9d9fed1613e994d8b"} Mar 20 11:20:24 crc kubenswrapper[4772]: I0320 11:20:24.557374 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-whj9r" podStartSLOduration=2.5203655510000003 podStartE2EDuration="5.557356202s" podCreationTimestamp="2026-03-20 11:20:19 +0000 UTC" firstStartedPulling="2026-03-20 11:20:20.503302648 +0000 UTC m=+1506.594269133" lastFinishedPulling="2026-03-20 11:20:23.540293299 +0000 UTC m=+1509.631259784" observedRunningTime="2026-03-20 11:20:24.556330285 +0000 UTC m=+1510.647296770" watchObservedRunningTime="2026-03-20 11:20:24.557356202 +0000 UTC m=+1510.648322687" Mar 20 11:20:29 crc kubenswrapper[4772]: I0320 11:20:29.750554 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:29 crc kubenswrapper[4772]: I0320 11:20:29.750940 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:29 crc kubenswrapper[4772]: I0320 11:20:29.790463 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:30 crc kubenswrapper[4772]: I0320 11:20:30.624562 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:30 crc kubenswrapper[4772]: I0320 11:20:30.680285 4772 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/certified-operators-whj9r"] Mar 20 11:20:32 crc kubenswrapper[4772]: I0320 11:20:32.589607 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-whj9r" podUID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" containerName="registry-server" containerID="cri-o://ece08f754290c5b936d56cebffd43ff627a9641d4be976b9d9fed1613e994d8b" gracePeriod=2 Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.606184 4772 generic.go:334] "Generic (PLEG): container finished" podID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" containerID="ece08f754290c5b936d56cebffd43ff627a9641d4be976b9d9fed1613e994d8b" exitCode=0 Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.606297 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whj9r" event={"ID":"b6c05da9-6e9c-49a7-af20-9a4f8eef1852","Type":"ContainerDied","Data":"ece08f754290c5b936d56cebffd43ff627a9641d4be976b9d9fed1613e994d8b"} Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.709821 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.864785 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-catalog-content\") pod \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.864919 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xtcwb\" (UniqueName: \"kubernetes.io/projected/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-kube-api-access-xtcwb\") pod \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.865044 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-utilities\") pod \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\" (UID: \"b6c05da9-6e9c-49a7-af20-9a4f8eef1852\") " Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.866008 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-utilities" (OuterVolumeSpecName: "utilities") pod "b6c05da9-6e9c-49a7-af20-9a4f8eef1852" (UID: "b6c05da9-6e9c-49a7-af20-9a4f8eef1852"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.871003 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-kube-api-access-xtcwb" (OuterVolumeSpecName: "kube-api-access-xtcwb") pod "b6c05da9-6e9c-49a7-af20-9a4f8eef1852" (UID: "b6c05da9-6e9c-49a7-af20-9a4f8eef1852"). InnerVolumeSpecName "kube-api-access-xtcwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.912507 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b6c05da9-6e9c-49a7-af20-9a4f8eef1852" (UID: "b6c05da9-6e9c-49a7-af20-9a4f8eef1852"). 
InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.966552 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.966598 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:33 crc kubenswrapper[4772]: I0320 11:20:33.966612 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xtcwb\" (UniqueName: \"kubernetes.io/projected/b6c05da9-6e9c-49a7-af20-9a4f8eef1852-kube-api-access-xtcwb\") on node \"crc\" DevicePath \"\"" Mar 20 11:20:34 crc kubenswrapper[4772]: I0320 11:20:34.617301 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-whj9r" event={"ID":"b6c05da9-6e9c-49a7-af20-9a4f8eef1852","Type":"ContainerDied","Data":"31d357366d387c475c48a42589b6d6970f47f42edb90487f293db0d5b07b7f0e"} Mar 20 11:20:34 crc kubenswrapper[4772]: I0320 11:20:34.617379 4772 scope.go:117] "RemoveContainer" containerID="ece08f754290c5b936d56cebffd43ff627a9641d4be976b9d9fed1613e994d8b" Mar 20 11:20:34 crc kubenswrapper[4772]: I0320 11:20:34.617464 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-whj9r" Mar 20 11:20:34 crc kubenswrapper[4772]: I0320 11:20:34.638203 4772 scope.go:117] "RemoveContainer" containerID="d8a8343f137593a78c9b3b44b02b0836e7b537184243501228a9c40ba73b1bb1" Mar 20 11:20:34 crc kubenswrapper[4772]: I0320 11:20:34.662956 4772 scope.go:117] "RemoveContainer" containerID="9c7fddaa422ce5563d8a3160794fb3fc1ca18d705dc2aa7a37c819d22740e7bf" Mar 20 11:20:34 crc kubenswrapper[4772]: I0320 11:20:34.675686 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-whj9r"] Mar 20 11:20:34 crc kubenswrapper[4772]: I0320 11:20:34.685104 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-whj9r"] Mar 20 11:20:36 crc kubenswrapper[4772]: I0320 11:20:36.652183 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" path="/var/lib/kubelet/pods/b6c05da9-6e9c-49a7-af20-9a4f8eef1852/volumes" Mar 20 11:20:38 crc kubenswrapper[4772]: I0320 11:20:38.681914 4772 scope.go:117] "RemoveContainer" containerID="36051500080ff917f5010cf28907ec2b464c24ebe5dd94320668dd7b033d1771" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.408192 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-7fqv7"] Mar 20 11:21:00 crc kubenswrapper[4772]: E0320 11:21:00.410107 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" containerName="extract-utilities" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.410136 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" containerName="extract-utilities" Mar 20 11:21:00 crc kubenswrapper[4772]: E0320 11:21:00.410161 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" containerName="extract-content" Mar 20 11:21:00 crc kubenswrapper[4772]: 
I0320 11:21:00.410170 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" containerName="extract-content" Mar 20 11:21:00 crc kubenswrapper[4772]: E0320 11:21:00.410187 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" containerName="registry-server" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.410195 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" containerName="registry-server" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.410418 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b6c05da9-6e9c-49a7-af20-9a4f8eef1852" containerName="registry-server" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.412230 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.421682 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7fqv7"] Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.462477 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-djnpb\" (UniqueName: \"kubernetes.io/projected/33660497-fea8-485a-a5ce-aebc8c8fe8a8-kube-api-access-djnpb\") pod \"community-operators-7fqv7\" (UID: \"33660497-fea8-485a-a5ce-aebc8c8fe8a8\") " pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.463021 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33660497-fea8-485a-a5ce-aebc8c8fe8a8-utilities\") pod \"community-operators-7fqv7\" (UID: \"33660497-fea8-485a-a5ce-aebc8c8fe8a8\") " pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.463096 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33660497-fea8-485a-a5ce-aebc8c8fe8a8-catalog-content\") pod \"community-operators-7fqv7\" (UID: \"33660497-fea8-485a-a5ce-aebc8c8fe8a8\") " pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.565083 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-djnpb\" (UniqueName: \"kubernetes.io/projected/33660497-fea8-485a-a5ce-aebc8c8fe8a8-kube-api-access-djnpb\") pod \"community-operators-7fqv7\" (UID: \"33660497-fea8-485a-a5ce-aebc8c8fe8a8\") " pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.565136 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33660497-fea8-485a-a5ce-aebc8c8fe8a8-utilities\") pod \"community-operators-7fqv7\" (UID: \"33660497-fea8-485a-a5ce-aebc8c8fe8a8\") " pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.565173 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33660497-fea8-485a-a5ce-aebc8c8fe8a8-catalog-content\") pod \"community-operators-7fqv7\" (UID: \"33660497-fea8-485a-a5ce-aebc8c8fe8a8\") " 
pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.565643 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/33660497-fea8-485a-a5ce-aebc8c8fe8a8-catalog-content\") pod \"community-operators-7fqv7\" (UID: \"33660497-fea8-485a-a5ce-aebc8c8fe8a8\") " pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.566144 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/33660497-fea8-485a-a5ce-aebc8c8fe8a8-utilities\") pod \"community-operators-7fqv7\" (UID: \"33660497-fea8-485a-a5ce-aebc8c8fe8a8\") " pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.586105 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-djnpb\" (UniqueName: \"kubernetes.io/projected/33660497-fea8-485a-a5ce-aebc8c8fe8a8-kube-api-access-djnpb\") pod \"community-operators-7fqv7\" (UID: \"33660497-fea8-485a-a5ce-aebc8c8fe8a8\") " pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:00 crc kubenswrapper[4772]: I0320 11:21:00.772720 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:01 crc kubenswrapper[4772]: I0320 11:21:01.067499 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7fqv7"] Mar 20 11:21:01 crc kubenswrapper[4772]: I0320 11:21:01.405534 4772 generic.go:334] "Generic (PLEG): container finished" podID="33660497-fea8-485a-a5ce-aebc8c8fe8a8" containerID="9d957645d72a1f42a90b5630fbc19209b61e296bb2e7dd253413a2954ca48df9" exitCode=0 Mar 20 11:21:01 crc kubenswrapper[4772]: I0320 11:21:01.405577 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fqv7" event={"ID":"33660497-fea8-485a-a5ce-aebc8c8fe8a8","Type":"ContainerDied","Data":"9d957645d72a1f42a90b5630fbc19209b61e296bb2e7dd253413a2954ca48df9"} Mar 20 11:21:01 crc kubenswrapper[4772]: I0320 11:21:01.405601 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fqv7" event={"ID":"33660497-fea8-485a-a5ce-aebc8c8fe8a8","Type":"ContainerStarted","Data":"d307fd5912bba0be2a2cdf0950c4970b05e680feee3801849cb89f06878eeb89"} Mar 20 11:21:01 crc kubenswrapper[4772]: I0320 11:21:01.408162 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:21:05 crc kubenswrapper[4772]: E0320 11:21:05.882201 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33660497_fea8_485a_a5ce_aebc8c8fe8a8.slice/crio-e32c795ac39f4dc80e17525eb4e444545f126913738704db03e6af8a04cf6ee8.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33660497_fea8_485a_a5ce_aebc8c8fe8a8.slice/crio-conmon-e32c795ac39f4dc80e17525eb4e444545f126913738704db03e6af8a04cf6ee8.scope\": RecentStats: unable to find data in memory cache]" Mar 20 11:21:06 crc kubenswrapper[4772]: I0320 11:21:06.444500 4772 generic.go:334] "Generic (PLEG): container finished" podID="33660497-fea8-485a-a5ce-aebc8c8fe8a8" 
containerID="e32c795ac39f4dc80e17525eb4e444545f126913738704db03e6af8a04cf6ee8" exitCode=0 Mar 20 11:21:06 crc kubenswrapper[4772]: I0320 11:21:06.444845 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fqv7" event={"ID":"33660497-fea8-485a-a5ce-aebc8c8fe8a8","Type":"ContainerDied","Data":"e32c795ac39f4dc80e17525eb4e444545f126913738704db03e6af8a04cf6ee8"} Mar 20 11:21:08 crc kubenswrapper[4772]: I0320 11:21:08.464328 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-7fqv7" event={"ID":"33660497-fea8-485a-a5ce-aebc8c8fe8a8","Type":"ContainerStarted","Data":"b61fdbdfed764b88af09d71b300b5c7e1e6d3238ae380eedebd735335f5459c7"} Mar 20 11:21:08 crc kubenswrapper[4772]: I0320 11:21:08.497216 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-7fqv7" podStartSLOduration=2.598998318 podStartE2EDuration="8.497197685s" podCreationTimestamp="2026-03-20 11:21:00 +0000 UTC" firstStartedPulling="2026-03-20 11:21:01.407957006 +0000 UTC m=+1547.498923491" lastFinishedPulling="2026-03-20 11:21:07.306156373 +0000 UTC m=+1553.397122858" observedRunningTime="2026-03-20 11:21:08.490980956 +0000 UTC m=+1554.581947461" watchObservedRunningTime="2026-03-20 11:21:08.497197685 +0000 UTC m=+1554.588164170" Mar 20 11:21:09 crc kubenswrapper[4772]: I0320 11:21:09.565080 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:21:09 crc kubenswrapper[4772]: I0320 11:21:09.565149 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:21:10 crc kubenswrapper[4772]: I0320 11:21:10.773248 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:10 crc kubenswrapper[4772]: I0320 11:21:10.773636 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:10 crc kubenswrapper[4772]: I0320 11:21:10.821944 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:20 crc kubenswrapper[4772]: I0320 11:21:20.822057 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-7fqv7" Mar 20 11:21:20 crc kubenswrapper[4772]: I0320 11:21:20.893444 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-7fqv7"] Mar 20 11:21:20 crc kubenswrapper[4772]: I0320 11:21:20.939714 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l6262"] Mar 20 11:21:20 crc kubenswrapper[4772]: I0320 11:21:20.940268 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-l6262" podUID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" containerName="registry-server" 
containerID="cri-o://d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5" gracePeriod=2 Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.338112 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l6262" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.386219 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-catalog-content\") pod \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.386315 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-utilities\") pod \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.386347 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd42h\" (UniqueName: \"kubernetes.io/projected/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-kube-api-access-sd42h\") pod \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\" (UID: \"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510\") " Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.387229 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-utilities" (OuterVolumeSpecName: "utilities") pod "3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" (UID: "3ac21cd0-22f1-4a4f-9ac1-06a867cb7510"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.394386 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-kube-api-access-sd42h" (OuterVolumeSpecName: "kube-api-access-sd42h") pod "3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" (UID: "3ac21cd0-22f1-4a4f-9ac1-06a867cb7510"). InnerVolumeSpecName "kube-api-access-sd42h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.451074 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" (UID: "3ac21cd0-22f1-4a4f-9ac1-06a867cb7510"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.487274 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.487310 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.487320 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sd42h\" (UniqueName: \"kubernetes.io/projected/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510-kube-api-access-sd42h\") on node \"crc\" DevicePath \"\"" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.559337 4772 generic.go:334] "Generic (PLEG): container finished" podID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" containerID="d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5" exitCode=0 Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.560101 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-l6262" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.560983 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6262" event={"ID":"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510","Type":"ContainerDied","Data":"d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5"} Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.561036 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-l6262" event={"ID":"3ac21cd0-22f1-4a4f-9ac1-06a867cb7510","Type":"ContainerDied","Data":"f764be5139eada7d786c76f5a5741ffe9d46bb32c5cdca5cb209975d5ff9e087"} Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.561079 4772 scope.go:117] "RemoveContainer" containerID="d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.581052 4772 scope.go:117] "RemoveContainer" containerID="e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.601375 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-l6262"] Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.608611 4772 scope.go:117] "RemoveContainer" containerID="b9fe89e9820c683dfbb1c6e302bcb9f39e16a8da99a739d036a5a9b88ac5b367" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.613464 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-l6262"] Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.629591 4772 scope.go:117] "RemoveContainer" containerID="d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5" Mar 20 11:21:21 crc kubenswrapper[4772]: E0320 11:21:21.630023 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5\": container with ID starting with d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5 not found: ID does not exist" containerID="d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.630148 
4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5"} err="failed to get container status \"d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5\": rpc error: code = NotFound desc = could not find container \"d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5\": container with ID starting with d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5 not found: ID does not exist" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.630247 4772 scope.go:117] "RemoveContainer" containerID="e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26" Mar 20 11:21:21 crc kubenswrapper[4772]: E0320 11:21:21.630764 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26\": container with ID starting with e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26 not found: ID does not exist" containerID="e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.630787 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26"} err="failed to get container status \"e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26\": rpc error: code = NotFound desc = could not find container \"e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26\": container with ID starting with e6d9b2783d938cdb808c22acad685e70cca1090f1c9a7cc0e63f097279e88d26 not found: ID does not exist" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.630808 4772 scope.go:117] "RemoveContainer" containerID="b9fe89e9820c683dfbb1c6e302bcb9f39e16a8da99a739d036a5a9b88ac5b367" Mar 20 11:21:21 crc kubenswrapper[4772]: E0320 11:21:21.631220 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9fe89e9820c683dfbb1c6e302bcb9f39e16a8da99a739d036a5a9b88ac5b367\": container with ID starting with b9fe89e9820c683dfbb1c6e302bcb9f39e16a8da99a739d036a5a9b88ac5b367 not found: ID does not exist" containerID="b9fe89e9820c683dfbb1c6e302bcb9f39e16a8da99a739d036a5a9b88ac5b367" Mar 20 11:21:21 crc kubenswrapper[4772]: I0320 11:21:21.631336 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9fe89e9820c683dfbb1c6e302bcb9f39e16a8da99a739d036a5a9b88ac5b367"} err="failed to get container status \"b9fe89e9820c683dfbb1c6e302bcb9f39e16a8da99a739d036a5a9b88ac5b367\": rpc error: code = NotFound desc = could not find container \"b9fe89e9820c683dfbb1c6e302bcb9f39e16a8da99a739d036a5a9b88ac5b367\": container with ID starting with b9fe89e9820c683dfbb1c6e302bcb9f39e16a8da99a739d036a5a9b88ac5b367 not found: ID does not exist" Mar 20 11:21:22 crc kubenswrapper[4772]: I0320 11:21:22.651933 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" path="/var/lib/kubelet/pods/3ac21cd0-22f1-4a4f-9ac1-06a867cb7510/volumes" Mar 20 11:21:39 crc kubenswrapper[4772]: I0320 11:21:39.966990 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
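
The paired "RemoveContainer" / "ContainerStatus from runtime service failed ... NotFound" messages just above are noisy but benign: the containers are already gone, the lookup error is logged, and cleanup proceeds (the pod's orphaned volumes directory is removed a second later). A minimal sketch of that treat-NotFound-as-already-removed pattern; container_status and NotFoundError are hypothetical stand-ins for the CRI call and its NotFound status, not a real client:

    class NotFoundError(Exception):
        """Stand-in for a CRI NotFound status (hypothetical, for illustration only)."""

    def container_status(container_id):
        # Hypothetical lookup standing in for the ContainerStatus RPC; here every
        # container is already gone, like d00f34c7..., e6d9b278... and b9fe89e9... above.
        raise NotFoundError("could not find container " + container_id)

    def ensure_removed(container_id):
        try:
            container_status(container_id)
        except NotFoundError:
            return  # already gone: same end state as a successful removal
        # a real implementation would remove the still-existing container here

    ensure_removed("d00f34c77c90137134a0927cb9e48b8d2b166062839b7e08935db10ab31672b5")
    print("NotFound treated as already removed; cleanup continues")
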
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:21:39 crc kubenswrapper[4772]: I0320 11:21:39.967558 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.137405 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566762-bfkz4"] Mar 20 11:22:00 crc kubenswrapper[4772]: E0320 11:22:00.138333 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" containerName="extract-utilities" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.138351 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" containerName="extract-utilities" Mar 20 11:22:00 crc kubenswrapper[4772]: E0320 11:22:00.138371 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" containerName="extract-content" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.138379 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" containerName="extract-content" Mar 20 11:22:00 crc kubenswrapper[4772]: E0320 11:22:00.138398 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" containerName="registry-server" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.138406 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" containerName="registry-server" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.138579 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="3ac21cd0-22f1-4a4f-9ac1-06a867cb7510" containerName="registry-server" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.139139 4772 util.go:30] "No sandbox for pod can be found. 
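
The machine-config-daemon liveness failures are plain HTTP GETs against http://127.0.0.1:8798/health being refused because nothing is listening on that port. A rough approximation of such a probe is sketched below; the real kubelet prober also applies the timeout, period and failure-threshold settings from the pod spec, which this log does not show:

    import urllib.error
    import urllib.request

    def http_probe(url, timeout=1.0):
        """GET the URL; a 2xx/3xx response counts as success, anything else as failure."""
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                return 200 <= resp.status < 400, "HTTP %d" % resp.status
        except urllib.error.URLError as err:
            # e.g. "connection refused" while the daemon is down, as in the entries above
            return False, str(err.reason)
        except OSError as err:
            return False, str(err)

    ok, detail = http_probe("http://127.0.0.1:8798/health")
    print("Liveness probe:", "success" if ok else "failure", "-", detail)
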
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-bfkz4" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.141295 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.141624 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.141889 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.144654 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-bfkz4"] Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.327401 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jpw68\" (UniqueName: \"kubernetes.io/projected/668f5b39-b45e-41fc-83a6-a92c6dc32a41-kube-api-access-jpw68\") pod \"auto-csr-approver-29566762-bfkz4\" (UID: \"668f5b39-b45e-41fc-83a6-a92c6dc32a41\") " pod="openshift-infra/auto-csr-approver-29566762-bfkz4" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.428907 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jpw68\" (UniqueName: \"kubernetes.io/projected/668f5b39-b45e-41fc-83a6-a92c6dc32a41-kube-api-access-jpw68\") pod \"auto-csr-approver-29566762-bfkz4\" (UID: \"668f5b39-b45e-41fc-83a6-a92c6dc32a41\") " pod="openshift-infra/auto-csr-approver-29566762-bfkz4" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.448949 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jpw68\" (UniqueName: \"kubernetes.io/projected/668f5b39-b45e-41fc-83a6-a92c6dc32a41-kube-api-access-jpw68\") pod \"auto-csr-approver-29566762-bfkz4\" (UID: \"668f5b39-b45e-41fc-83a6-a92c6dc32a41\") " pod="openshift-infra/auto-csr-approver-29566762-bfkz4" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.510502 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-bfkz4" Mar 20 11:22:00 crc kubenswrapper[4772]: I0320 11:22:00.908535 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-bfkz4"] Mar 20 11:22:01 crc kubenswrapper[4772]: I0320 11:22:01.145555 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-bfkz4" event={"ID":"668f5b39-b45e-41fc-83a6-a92c6dc32a41","Type":"ContainerStarted","Data":"6c7ada8b7f1a4df344250498af07ee4cc63f0ca87548c569eb831b0ea752a72e"} Mar 20 11:22:03 crc kubenswrapper[4772]: I0320 11:22:03.159335 4772 generic.go:334] "Generic (PLEG): container finished" podID="668f5b39-b45e-41fc-83a6-a92c6dc32a41" containerID="586cb1897a21024b28250ac448d83cc3387ff8c7e49cfd47106088974f01a196" exitCode=0 Mar 20 11:22:03 crc kubenswrapper[4772]: I0320 11:22:03.159384 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-bfkz4" event={"ID":"668f5b39-b45e-41fc-83a6-a92c6dc32a41","Type":"ContainerDied","Data":"586cb1897a21024b28250ac448d83cc3387ff8c7e49cfd47106088974f01a196"} Mar 20 11:22:04 crc kubenswrapper[4772]: I0320 11:22:04.430518 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-bfkz4" Mar 20 11:22:04 crc kubenswrapper[4772]: I0320 11:22:04.586965 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jpw68\" (UniqueName: \"kubernetes.io/projected/668f5b39-b45e-41fc-83a6-a92c6dc32a41-kube-api-access-jpw68\") pod \"668f5b39-b45e-41fc-83a6-a92c6dc32a41\" (UID: \"668f5b39-b45e-41fc-83a6-a92c6dc32a41\") " Mar 20 11:22:04 crc kubenswrapper[4772]: I0320 11:22:04.592588 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/668f5b39-b45e-41fc-83a6-a92c6dc32a41-kube-api-access-jpw68" (OuterVolumeSpecName: "kube-api-access-jpw68") pod "668f5b39-b45e-41fc-83a6-a92c6dc32a41" (UID: "668f5b39-b45e-41fc-83a6-a92c6dc32a41"). InnerVolumeSpecName "kube-api-access-jpw68". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:22:04 crc kubenswrapper[4772]: I0320 11:22:04.689133 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jpw68\" (UniqueName: \"kubernetes.io/projected/668f5b39-b45e-41fc-83a6-a92c6dc32a41-kube-api-access-jpw68\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:05 crc kubenswrapper[4772]: I0320 11:22:05.174958 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566762-bfkz4" event={"ID":"668f5b39-b45e-41fc-83a6-a92c6dc32a41","Type":"ContainerDied","Data":"6c7ada8b7f1a4df344250498af07ee4cc63f0ca87548c569eb831b0ea752a72e"} Mar 20 11:22:05 crc kubenswrapper[4772]: I0320 11:22:05.175019 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c7ada8b7f1a4df344250498af07ee4cc63f0ca87548c569eb831b0ea752a72e" Mar 20 11:22:05 crc kubenswrapper[4772]: I0320 11:22:05.174994 4772 util.go:48] "No ready sandbox for pod can be found. 
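
The auto-csr-approver jobs land at exactly 11:22, 11:24 and 11:26 with suffixes 29566762, 29566764 and 29566766, which is consistent with the usual CronJob naming scheme where the numeric suffix is the scheduled run time in minutes since the Unix epoch (an every-two-minutes schedule advances the suffix by 2). Decoding the suffixes reproduces the timestamps seen in these entries; this is a consistency check, not a statement about the CronJob's actual spec, which is not in the log:

    from datetime import datetime, timezone

    def cronjob_suffix_to_time(suffix):
        """Job-name suffix -> scheduled run time (minutes since the Unix epoch)."""
        return datetime.fromtimestamp(suffix * 60, tz=timezone.utc)

    for suffix in (29566762, 29566764, 29566766):
        print(suffix, "->", cronjob_suffix_to_time(suffix))
    # 29566762 -> 2026-03-20 11:22:00+00:00
    # 29566764 -> 2026-03-20 11:24:00+00:00
    # 29566766 -> 2026-03-20 11:26:00+00:00
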
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566762-bfkz4" Mar 20 11:22:05 crc kubenswrapper[4772]: I0320 11:22:05.492762 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-qwnl7"] Mar 20 11:22:05 crc kubenswrapper[4772]: I0320 11:22:05.498116 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566756-qwnl7"] Mar 20 11:22:06 crc kubenswrapper[4772]: I0320 11:22:06.652728 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c900acae-a1f8-4ba6-9288-b7ef705e0c9e" path="/var/lib/kubelet/pods/c900acae-a1f8-4ba6-9288-b7ef705e0c9e/volumes" Mar 20 11:22:09 crc kubenswrapper[4772]: I0320 11:22:09.565012 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:22:09 crc kubenswrapper[4772]: I0320 11:22:09.565367 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:22:09 crc kubenswrapper[4772]: I0320 11:22:09.565417 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 11:22:09 crc kubenswrapper[4772]: I0320 11:22:09.566019 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:22:09 crc kubenswrapper[4772]: I0320 11:22:09.566103 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" gracePeriod=600 Mar 20 11:22:09 crc kubenswrapper[4772]: E0320 11:22:09.709957 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:22:10 crc kubenswrapper[4772]: I0320 11:22:10.214686 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" exitCode=0 Mar 20 11:22:10 crc kubenswrapper[4772]: I0320 11:22:10.214721 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a"} Mar 20 
11:22:10 crc kubenswrapper[4772]: I0320 11:22:10.215088 4772 scope.go:117] "RemoveContainer" containerID="7c8714aee25d85b48463c81e409c35750bf47c0d95326d0c06a87712c350ffeb" Mar 20 11:22:10 crc kubenswrapper[4772]: I0320 11:22:10.215572 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:22:10 crc kubenswrapper[4772]: E0320 11:22:10.215905 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:22:20 crc kubenswrapper[4772]: I0320 11:22:20.641668 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:22:20 crc kubenswrapper[4772]: E0320 11:22:20.643763 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.714823 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-vd8rt"] Mar 20 11:22:29 crc kubenswrapper[4772]: E0320 11:22:29.715890 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="668f5b39-b45e-41fc-83a6-a92c6dc32a41" containerName="oc" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.715910 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="668f5b39-b45e-41fc-83a6-a92c6dc32a41" containerName="oc" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.716171 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="668f5b39-b45e-41fc-83a6-a92c6dc32a41" containerName="oc" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.717770 4772 util.go:30] "No sandbox for pod can be found. 
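
"back-off 5m0s restarting failed container=machine-config-daemon" means the restart backoff for this container has reached the kubelet's cap: by default the kubelet doubles the delay after each failed restart from 10s up to a 5-minute maximum, and while the backoff is active every pod sync just re-logs the CrashLoopBackOff error, which is why the same message keeps repeating below every 10-15 seconds rather than once per five minutes. A small sketch of that default-style schedule (the 10s initial delay and 300s cap are the upstream defaults and could differ on a tuned node):

    from datetime import timedelta

    def crashloop_delays(initial=10, cap=300, restarts=8):
        """Doubling restart backoff with a cap, in the style of the kubelet defaults."""
        delay = initial
        for n in range(1, restarts + 1):
            yield n, timedelta(seconds=min(delay, cap))
            delay = min(delay * 2, cap)

    for restart, delay in crashloop_delays():
        print("restart #%d: wait %s" % (restart, delay))
    # 10s, 20s, 40s, 80s, 160s, then 5m from the sixth failed restart onward,
    # which is the "back-off 5m0s" state this machine-config-daemon is stuck in.
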
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.724599 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd8rt"] Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.828279 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-catalog-content\") pod \"redhat-marketplace-vd8rt\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.828763 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4md4f\" (UniqueName: \"kubernetes.io/projected/49b00b7e-48ff-4853-93dd-ff86ed572e69-kube-api-access-4md4f\") pod \"redhat-marketplace-vd8rt\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.828969 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-utilities\") pod \"redhat-marketplace-vd8rt\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.929702 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-utilities\") pod \"redhat-marketplace-vd8rt\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.929776 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-catalog-content\") pod \"redhat-marketplace-vd8rt\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.929798 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4md4f\" (UniqueName: \"kubernetes.io/projected/49b00b7e-48ff-4853-93dd-ff86ed572e69-kube-api-access-4md4f\") pod \"redhat-marketplace-vd8rt\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.930476 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-catalog-content\") pod \"redhat-marketplace-vd8rt\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.930501 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-utilities\") pod \"redhat-marketplace-vd8rt\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:29 crc kubenswrapper[4772]: I0320 11:22:29.956351 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-4md4f\" (UniqueName: \"kubernetes.io/projected/49b00b7e-48ff-4853-93dd-ff86ed572e69-kube-api-access-4md4f\") pod \"redhat-marketplace-vd8rt\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:30 crc kubenswrapper[4772]: I0320 11:22:30.042356 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:30 crc kubenswrapper[4772]: I0320 11:22:30.457532 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd8rt"] Mar 20 11:22:31 crc kubenswrapper[4772]: I0320 11:22:31.359341 4772 generic.go:334] "Generic (PLEG): container finished" podID="49b00b7e-48ff-4853-93dd-ff86ed572e69" containerID="967c83de47271ed89c5d0e7a5fdaee1e3e2e2504e85bb06d01a8733d216539bf" exitCode=0 Mar 20 11:22:31 crc kubenswrapper[4772]: I0320 11:22:31.359388 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd8rt" event={"ID":"49b00b7e-48ff-4853-93dd-ff86ed572e69","Type":"ContainerDied","Data":"967c83de47271ed89c5d0e7a5fdaee1e3e2e2504e85bb06d01a8733d216539bf"} Mar 20 11:22:31 crc kubenswrapper[4772]: I0320 11:22:31.359613 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd8rt" event={"ID":"49b00b7e-48ff-4853-93dd-ff86ed572e69","Type":"ContainerStarted","Data":"aa2d473ea2339d5aeb35d51ed0e2b5a70bd3b7b863ee555084329b2f96949bbe"} Mar 20 11:22:33 crc kubenswrapper[4772]: I0320 11:22:33.376028 4772 generic.go:334] "Generic (PLEG): container finished" podID="49b00b7e-48ff-4853-93dd-ff86ed572e69" containerID="976ae9b2d100148b381cf4740993ca3c945137365661f3547174241398837376" exitCode=0 Mar 20 11:22:33 crc kubenswrapper[4772]: I0320 11:22:33.376136 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd8rt" event={"ID":"49b00b7e-48ff-4853-93dd-ff86ed572e69","Type":"ContainerDied","Data":"976ae9b2d100148b381cf4740993ca3c945137365661f3547174241398837376"} Mar 20 11:22:33 crc kubenswrapper[4772]: I0320 11:22:33.643461 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:22:33 crc kubenswrapper[4772]: E0320 11:22:33.644030 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:22:36 crc kubenswrapper[4772]: I0320 11:22:36.405348 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd8rt" event={"ID":"49b00b7e-48ff-4853-93dd-ff86ed572e69","Type":"ContainerStarted","Data":"8a65f788c5b7ad4a3a6c254a3374b2dada45c0569a3fc1fda08351fa057574bd"} Mar 20 11:22:36 crc kubenswrapper[4772]: I0320 11:22:36.427526 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-vd8rt" podStartSLOduration=3.415305178 podStartE2EDuration="7.427503509s" podCreationTimestamp="2026-03-20 11:22:29 +0000 UTC" firstStartedPulling="2026-03-20 11:22:31.360908609 +0000 UTC m=+1637.451875094" lastFinishedPulling="2026-03-20 
11:22:35.37310692 +0000 UTC m=+1641.464073425" observedRunningTime="2026-03-20 11:22:36.422803671 +0000 UTC m=+1642.513770166" watchObservedRunningTime="2026-03-20 11:22:36.427503509 +0000 UTC m=+1642.518469994" Mar 20 11:22:38 crc kubenswrapper[4772]: I0320 11:22:38.800529 4772 scope.go:117] "RemoveContainer" containerID="8f25f50b590f9c8c97f592b36feeff33abf1be5c31dfc659dd5bf5f5d4ec2202" Mar 20 11:22:40 crc kubenswrapper[4772]: I0320 11:22:40.043611 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:40 crc kubenswrapper[4772]: I0320 11:22:40.044013 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:40 crc kubenswrapper[4772]: I0320 11:22:40.113210 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:40 crc kubenswrapper[4772]: I0320 11:22:40.494253 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:40 crc kubenswrapper[4772]: I0320 11:22:40.549438 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd8rt"] Mar 20 11:22:42 crc kubenswrapper[4772]: I0320 11:22:42.452624 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-vd8rt" podUID="49b00b7e-48ff-4853-93dd-ff86ed572e69" containerName="registry-server" containerID="cri-o://8a65f788c5b7ad4a3a6c254a3374b2dada45c0569a3fc1fda08351fa057574bd" gracePeriod=2 Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.459770 4772 generic.go:334] "Generic (PLEG): container finished" podID="49b00b7e-48ff-4853-93dd-ff86ed572e69" containerID="8a65f788c5b7ad4a3a6c254a3374b2dada45c0569a3fc1fda08351fa057574bd" exitCode=0 Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.459860 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd8rt" event={"ID":"49b00b7e-48ff-4853-93dd-ff86ed572e69","Type":"ContainerDied","Data":"8a65f788c5b7ad4a3a6c254a3374b2dada45c0569a3fc1fda08351fa057574bd"} Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.460135 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-vd8rt" event={"ID":"49b00b7e-48ff-4853-93dd-ff86ed572e69","Type":"ContainerDied","Data":"aa2d473ea2339d5aeb35d51ed0e2b5a70bd3b7b863ee555084329b2f96949bbe"} Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.460147 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="aa2d473ea2339d5aeb35d51ed0e2b5a70bd3b7b863ee555084329b2f96949bbe" Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.462045 4772 util.go:48] "No ready sandbox for pod can be found. 
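
The "m=+1547.498923491"-style suffixes on these timestamps are Go monotonic-clock readings, effectively seconds since the kubelet process started, so differences between them give drift-free durations and wall time minus the offset estimates the process start. Assuming that interpretation, both cross-checks below agree with the rest of the log (the 5.898 s pull window from the community-operators-7fqv7 entry, and a kubelet start around 10:55):

    from datetime import datetime, timedelta, timezone

    # watchObservedRunningTime for community-operators-7fqv7 and its m=+ offset
    # (nanoseconds rounded to microseconds for datetime).
    wall  = datetime(2026, 3, 20, 11, 21, 8, 497198, tzinfo=timezone.utc)
    m_off = 1554.588164170  # seconds on the monotonic clock, i.e. since process start

    # lastFinishedPulling minus firstStartedPulling, taken from the m= offsets
    pull = 1553.397122858 - 1547.498923491
    print("pull window from m= offsets: %.9f s" % pull)   # ~5.898199367, matches the SLO entry

    print("approx. kubelet start:", wall - timedelta(seconds=m_off))
    # ~2026-03-20 10:55:13.9 UTC, i.e. this kubelet had been running for about 26 minutes here
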
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.616516 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-catalog-content\") pod \"49b00b7e-48ff-4853-93dd-ff86ed572e69\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.616689 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4md4f\" (UniqueName: \"kubernetes.io/projected/49b00b7e-48ff-4853-93dd-ff86ed572e69-kube-api-access-4md4f\") pod \"49b00b7e-48ff-4853-93dd-ff86ed572e69\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.616935 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-utilities\") pod \"49b00b7e-48ff-4853-93dd-ff86ed572e69\" (UID: \"49b00b7e-48ff-4853-93dd-ff86ed572e69\") " Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.617939 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-utilities" (OuterVolumeSpecName: "utilities") pod "49b00b7e-48ff-4853-93dd-ff86ed572e69" (UID: "49b00b7e-48ff-4853-93dd-ff86ed572e69"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.623084 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49b00b7e-48ff-4853-93dd-ff86ed572e69-kube-api-access-4md4f" (OuterVolumeSpecName: "kube-api-access-4md4f") pod "49b00b7e-48ff-4853-93dd-ff86ed572e69" (UID: "49b00b7e-48ff-4853-93dd-ff86ed572e69"). InnerVolumeSpecName "kube-api-access-4md4f". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.655211 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "49b00b7e-48ff-4853-93dd-ff86ed572e69" (UID: "49b00b7e-48ff-4853-93dd-ff86ed572e69"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.718428 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.718507 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/49b00b7e-48ff-4853-93dd-ff86ed572e69-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:43 crc kubenswrapper[4772]: I0320 11:22:43.718570 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4md4f\" (UniqueName: \"kubernetes.io/projected/49b00b7e-48ff-4853-93dd-ff86ed572e69-kube-api-access-4md4f\") on node \"crc\" DevicePath \"\"" Mar 20 11:22:44 crc kubenswrapper[4772]: I0320 11:22:44.466419 4772 util.go:48] "No ready sandbox for pod can be found. 
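
Each catalog-pod teardown in this capture walks the same per-volume sequence: "UnmountVolume started", then "UnmountVolume.TearDown succeeded", then "Volume detached" for catalog-content, utilities and the projected kube-api-access-* token, after which the orphaned pod directory is cleaned up. A small helper for pulling that sequence out of a saved journal capture; the phrases it matches are hand-fitted to the wording in these entries, not a stable interface:

    import re
    import sys

    # Volume-lifecycle phrases as they appear in the kubelet entries above; the
    # escaped quotes (\") are how they show up in this journal, hence the \\?" bits.
    EVENTS = [
        ("mount-started",   re.compile(r'MountVolume started for volume \\?"([^"\\]+)')),
        ("mount-succeeded", re.compile(r'MountVolume\.SetUp succeeded for volume \\?"([^"\\]+)')),
        ("unmount-started", re.compile(r'UnmountVolume started for volume \\?"([^"\\]+)')),
        ("teardown-ok",     re.compile(r'UnmountVolume\.TearDown succeeded .*?OuterVolumeSpecName: "([^"]+)"')),
        ("detached",        re.compile(r'Volume detached for volume \\?"([^"\\]+)')),
    ]

    def volume_events(journal_text):
        for line in journal_text.splitlines():
            for name, pattern in EVENTS:
                match = pattern.search(line)
                if match:
                    yield name, match.group(1)

    if __name__ == "__main__":
        for event, volume in volume_events(sys.stdin.read()):
            print("%-16s %s" % (event, volume))

Fed a capture of these kubelet entries on standard input, it prints each volume of pod 3ac21cd0-22f1-4a4f-9ac1-06a867cb7510 (and later 49b00b7e-48ff-4853-93dd-ff86ed572e69) passing through unmount-started, teardown-ok and detached in the order shown here.
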
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-vd8rt" Mar 20 11:22:44 crc kubenswrapper[4772]: I0320 11:22:44.492051 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd8rt"] Mar 20 11:22:44 crc kubenswrapper[4772]: I0320 11:22:44.508479 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-vd8rt"] Mar 20 11:22:44 crc kubenswrapper[4772]: I0320 11:22:44.646098 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:22:44 crc kubenswrapper[4772]: E0320 11:22:44.646339 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:22:44 crc kubenswrapper[4772]: I0320 11:22:44.651385 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49b00b7e-48ff-4853-93dd-ff86ed572e69" path="/var/lib/kubelet/pods/49b00b7e-48ff-4853-93dd-ff86ed572e69/volumes" Mar 20 11:22:56 crc kubenswrapper[4772]: I0320 11:22:56.642331 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:22:56 crc kubenswrapper[4772]: E0320 11:22:56.645459 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:23:07 crc kubenswrapper[4772]: I0320 11:23:07.642833 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:23:07 crc kubenswrapper[4772]: E0320 11:23:07.644115 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:23:19 crc kubenswrapper[4772]: I0320 11:23:19.641724 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:23:19 crc kubenswrapper[4772]: E0320 11:23:19.643528 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:23:33 crc kubenswrapper[4772]: I0320 11:23:33.641572 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:23:33 
crc kubenswrapper[4772]: E0320 11:23:33.642628 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:23:44 crc kubenswrapper[4772]: I0320 11:23:44.649155 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:23:44 crc kubenswrapper[4772]: E0320 11:23:44.650451 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:23:56 crc kubenswrapper[4772]: I0320 11:23:56.641806 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:23:56 crc kubenswrapper[4772]: E0320 11:23:56.642466 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.155259 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566764-p5c9w"] Mar 20 11:24:00 crc kubenswrapper[4772]: E0320 11:24:00.155783 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b00b7e-48ff-4853-93dd-ff86ed572e69" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.155794 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b00b7e-48ff-4853-93dd-ff86ed572e69" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4772]: E0320 11:24:00.155817 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b00b7e-48ff-4853-93dd-ff86ed572e69" containerName="extract-content" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.155823 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b00b7e-48ff-4853-93dd-ff86ed572e69" containerName="extract-content" Mar 20 11:24:00 crc kubenswrapper[4772]: E0320 11:24:00.155856 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49b00b7e-48ff-4853-93dd-ff86ed572e69" containerName="extract-utilities" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.155866 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="49b00b7e-48ff-4853-93dd-ff86ed572e69" containerName="extract-utilities" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.156016 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="49b00b7e-48ff-4853-93dd-ff86ed572e69" containerName="registry-server" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.156428 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-p5c9w" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.159017 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.159616 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.159724 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.183289 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-p5c9w"] Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.316676 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pv9l\" (UniqueName: \"kubernetes.io/projected/82c4e609-de65-4673-ba94-624e450f8548-kube-api-access-2pv9l\") pod \"auto-csr-approver-29566764-p5c9w\" (UID: \"82c4e609-de65-4673-ba94-624e450f8548\") " pod="openshift-infra/auto-csr-approver-29566764-p5c9w" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.419007 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2pv9l\" (UniqueName: \"kubernetes.io/projected/82c4e609-de65-4673-ba94-624e450f8548-kube-api-access-2pv9l\") pod \"auto-csr-approver-29566764-p5c9w\" (UID: \"82c4e609-de65-4673-ba94-624e450f8548\") " pod="openshift-infra/auto-csr-approver-29566764-p5c9w" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.436935 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pv9l\" (UniqueName: \"kubernetes.io/projected/82c4e609-de65-4673-ba94-624e450f8548-kube-api-access-2pv9l\") pod \"auto-csr-approver-29566764-p5c9w\" (UID: \"82c4e609-de65-4673-ba94-624e450f8548\") " pod="openshift-infra/auto-csr-approver-29566764-p5c9w" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.485301 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-p5c9w" Mar 20 11:24:00 crc kubenswrapper[4772]: I0320 11:24:00.910723 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-p5c9w"] Mar 20 11:24:01 crc kubenswrapper[4772]: I0320 11:24:01.216075 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-p5c9w" event={"ID":"82c4e609-de65-4673-ba94-624e450f8548","Type":"ContainerStarted","Data":"d23239a6bfab04c0abfc43ef5b871db1e9c834548cf31f078c6b21f8c5bb3973"} Mar 20 11:24:03 crc kubenswrapper[4772]: I0320 11:24:03.235404 4772 generic.go:334] "Generic (PLEG): container finished" podID="82c4e609-de65-4673-ba94-624e450f8548" containerID="e5085d30c4ba0c11e78692eae42d2540d7cbfe7678bf9ac1c35465a1a5e27bf1" exitCode=0 Mar 20 11:24:03 crc kubenswrapper[4772]: I0320 11:24:03.235468 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-p5c9w" event={"ID":"82c4e609-de65-4673-ba94-624e450f8548","Type":"ContainerDied","Data":"e5085d30c4ba0c11e78692eae42d2540d7cbfe7678bf9ac1c35465a1a5e27bf1"} Mar 20 11:24:04 crc kubenswrapper[4772]: I0320 11:24:04.571453 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-p5c9w" Mar 20 11:24:04 crc kubenswrapper[4772]: I0320 11:24:04.708769 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2pv9l\" (UniqueName: \"kubernetes.io/projected/82c4e609-de65-4673-ba94-624e450f8548-kube-api-access-2pv9l\") pod \"82c4e609-de65-4673-ba94-624e450f8548\" (UID: \"82c4e609-de65-4673-ba94-624e450f8548\") " Mar 20 11:24:04 crc kubenswrapper[4772]: I0320 11:24:04.725368 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82c4e609-de65-4673-ba94-624e450f8548-kube-api-access-2pv9l" (OuterVolumeSpecName: "kube-api-access-2pv9l") pod "82c4e609-de65-4673-ba94-624e450f8548" (UID: "82c4e609-de65-4673-ba94-624e450f8548"). InnerVolumeSpecName "kube-api-access-2pv9l". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:24:04 crc kubenswrapper[4772]: I0320 11:24:04.817632 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2pv9l\" (UniqueName: \"kubernetes.io/projected/82c4e609-de65-4673-ba94-624e450f8548-kube-api-access-2pv9l\") on node \"crc\" DevicePath \"\"" Mar 20 11:24:05 crc kubenswrapper[4772]: I0320 11:24:05.254124 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566764-p5c9w" event={"ID":"82c4e609-de65-4673-ba94-624e450f8548","Type":"ContainerDied","Data":"d23239a6bfab04c0abfc43ef5b871db1e9c834548cf31f078c6b21f8c5bb3973"} Mar 20 11:24:05 crc kubenswrapper[4772]: I0320 11:24:05.254163 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d23239a6bfab04c0abfc43ef5b871db1e9c834548cf31f078c6b21f8c5bb3973" Mar 20 11:24:05 crc kubenswrapper[4772]: I0320 11:24:05.254486 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566764-p5c9w" Mar 20 11:24:05 crc kubenswrapper[4772]: I0320 11:24:05.638196 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-z45gj"] Mar 20 11:24:05 crc kubenswrapper[4772]: I0320 11:24:05.642864 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566758-z45gj"] Mar 20 11:24:06 crc kubenswrapper[4772]: I0320 11:24:06.652418 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7a4736c7-2076-484e-9338-398a38fa7e0e" path="/var/lib/kubelet/pods/7a4736c7-2076-484e-9338-398a38fa7e0e/volumes" Mar 20 11:24:08 crc kubenswrapper[4772]: I0320 11:24:08.643731 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:24:08 crc kubenswrapper[4772]: E0320 11:24:08.644810 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:24:21 crc kubenswrapper[4772]: I0320 11:24:21.641646 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:24:21 crc kubenswrapper[4772]: E0320 11:24:21.642286 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:24:33 crc kubenswrapper[4772]: I0320 11:24:33.641762 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:24:33 crc kubenswrapper[4772]: E0320 11:24:33.642812 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:24:38 crc kubenswrapper[4772]: I0320 11:24:38.883690 4772 scope.go:117] "RemoveContainer" containerID="19e24cb9603d3448be2bcd7eada252e0b28b6d58979e4adbfda02e3bf19fc74b" Mar 20 11:24:48 crc kubenswrapper[4772]: I0320 11:24:48.642658 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:24:48 crc kubenswrapper[4772]: E0320 11:24:48.643521 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 
11:25:03 crc kubenswrapper[4772]: I0320 11:25:03.643320 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:25:03 crc kubenswrapper[4772]: E0320 11:25:03.644559 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:25:14 crc kubenswrapper[4772]: I0320 11:25:14.645508 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:25:14 crc kubenswrapper[4772]: E0320 11:25:14.646327 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:25:29 crc kubenswrapper[4772]: I0320 11:25:29.642897 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:25:29 crc kubenswrapper[4772]: E0320 11:25:29.644061 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:25:41 crc kubenswrapper[4772]: I0320 11:25:41.642338 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:25:41 crc kubenswrapper[4772]: E0320 11:25:41.644124 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:25:54 crc kubenswrapper[4772]: I0320 11:25:54.650521 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:25:54 crc kubenswrapper[4772]: E0320 11:25:54.651489 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.133050 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566766-lh5zc"] Mar 20 11:26:00 crc 
kubenswrapper[4772]: E0320 11:26:00.133646 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82c4e609-de65-4673-ba94-624e450f8548" containerName="oc" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.133661 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="82c4e609-de65-4673-ba94-624e450f8548" containerName="oc" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.133802 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="82c4e609-de65-4673-ba94-624e450f8548" containerName="oc" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.134374 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-lh5zc" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.141068 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.141292 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.141459 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.142392 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-lh5zc"] Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.168608 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v696k\" (UniqueName: \"kubernetes.io/projected/5d9984e5-88ba-4310-9ff4-6dfd2e5c4533-kube-api-access-v696k\") pod \"auto-csr-approver-29566766-lh5zc\" (UID: \"5d9984e5-88ba-4310-9ff4-6dfd2e5c4533\") " pod="openshift-infra/auto-csr-approver-29566766-lh5zc" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.269718 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v696k\" (UniqueName: \"kubernetes.io/projected/5d9984e5-88ba-4310-9ff4-6dfd2e5c4533-kube-api-access-v696k\") pod \"auto-csr-approver-29566766-lh5zc\" (UID: \"5d9984e5-88ba-4310-9ff4-6dfd2e5c4533\") " pod="openshift-infra/auto-csr-approver-29566766-lh5zc" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.291245 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v696k\" (UniqueName: \"kubernetes.io/projected/5d9984e5-88ba-4310-9ff4-6dfd2e5c4533-kube-api-access-v696k\") pod \"auto-csr-approver-29566766-lh5zc\" (UID: \"5d9984e5-88ba-4310-9ff4-6dfd2e5c4533\") " pod="openshift-infra/auto-csr-approver-29566766-lh5zc" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.455502 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-lh5zc" Mar 20 11:26:00 crc kubenswrapper[4772]: I0320 11:26:00.892135 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-lh5zc"] Mar 20 11:26:01 crc kubenswrapper[4772]: I0320 11:26:01.246616 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-lh5zc" event={"ID":"5d9984e5-88ba-4310-9ff4-6dfd2e5c4533","Type":"ContainerStarted","Data":"e4994c9fb6b043fa75e2cb9d4845250de32f072603f315d57f396c438334d8d8"} Mar 20 11:26:03 crc kubenswrapper[4772]: I0320 11:26:03.260538 4772 generic.go:334] "Generic (PLEG): container finished" podID="5d9984e5-88ba-4310-9ff4-6dfd2e5c4533" containerID="1618b2250445d384bbeec852ae4b2085c7bea901effba1e8ef6cc1d84393a0f4" exitCode=0 Mar 20 11:26:03 crc kubenswrapper[4772]: I0320 11:26:03.260595 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-lh5zc" event={"ID":"5d9984e5-88ba-4310-9ff4-6dfd2e5c4533","Type":"ContainerDied","Data":"1618b2250445d384bbeec852ae4b2085c7bea901effba1e8ef6cc1d84393a0f4"} Mar 20 11:26:04 crc kubenswrapper[4772]: I0320 11:26:04.579305 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-lh5zc" Mar 20 11:26:04 crc kubenswrapper[4772]: I0320 11:26:04.730881 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v696k\" (UniqueName: \"kubernetes.io/projected/5d9984e5-88ba-4310-9ff4-6dfd2e5c4533-kube-api-access-v696k\") pod \"5d9984e5-88ba-4310-9ff4-6dfd2e5c4533\" (UID: \"5d9984e5-88ba-4310-9ff4-6dfd2e5c4533\") " Mar 20 11:26:04 crc kubenswrapper[4772]: I0320 11:26:04.737003 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5d9984e5-88ba-4310-9ff4-6dfd2e5c4533-kube-api-access-v696k" (OuterVolumeSpecName: "kube-api-access-v696k") pod "5d9984e5-88ba-4310-9ff4-6dfd2e5c4533" (UID: "5d9984e5-88ba-4310-9ff4-6dfd2e5c4533"). InnerVolumeSpecName "kube-api-access-v696k". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:26:04 crc kubenswrapper[4772]: I0320 11:26:04.833324 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v696k\" (UniqueName: \"kubernetes.io/projected/5d9984e5-88ba-4310-9ff4-6dfd2e5c4533-kube-api-access-v696k\") on node \"crc\" DevicePath \"\"" Mar 20 11:26:05 crc kubenswrapper[4772]: I0320 11:26:05.276444 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566766-lh5zc" event={"ID":"5d9984e5-88ba-4310-9ff4-6dfd2e5c4533","Type":"ContainerDied","Data":"e4994c9fb6b043fa75e2cb9d4845250de32f072603f315d57f396c438334d8d8"} Mar 20 11:26:05 crc kubenswrapper[4772]: I0320 11:26:05.276489 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e4994c9fb6b043fa75e2cb9d4845250de32f072603f315d57f396c438334d8d8" Mar 20 11:26:05 crc kubenswrapper[4772]: I0320 11:26:05.277026 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566766-lh5zc" Mar 20 11:26:05 crc kubenswrapper[4772]: I0320 11:26:05.641014 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-bmqrx"] Mar 20 11:26:05 crc kubenswrapper[4772]: I0320 11:26:05.650301 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566760-bmqrx"] Mar 20 11:26:06 crc kubenswrapper[4772]: I0320 11:26:06.659028 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac923150-8ab0-4dd2-9fe7-9007302adef2" path="/var/lib/kubelet/pods/ac923150-8ab0-4dd2-9fe7-9007302adef2/volumes" Mar 20 11:26:09 crc kubenswrapper[4772]: I0320 11:26:09.642282 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:26:09 crc kubenswrapper[4772]: E0320 11:26:09.643163 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:26:23 crc kubenswrapper[4772]: I0320 11:26:23.642625 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:26:23 crc kubenswrapper[4772]: E0320 11:26:23.643769 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:26:34 crc kubenswrapper[4772]: I0320 11:26:34.656524 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:26:34 crc kubenswrapper[4772]: E0320 11:26:34.657729 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:26:38 crc kubenswrapper[4772]: I0320 11:26:38.969356 4772 scope.go:117] "RemoveContainer" containerID="c7a2244d64411c0cbefe4eaf6f0f813768f94fa5d9dd79c8b065c437666a6cc7" Mar 20 11:26:46 crc kubenswrapper[4772]: I0320 11:26:46.642331 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:26:46 crc kubenswrapper[4772]: E0320 11:26:46.642944 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 
11:26:57 crc kubenswrapper[4772]: I0320 11:26:57.643142 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:26:57 crc kubenswrapper[4772]: E0320 11:26:57.644171 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:27:08 crc kubenswrapper[4772]: I0320 11:27:08.642163 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:27:08 crc kubenswrapper[4772]: E0320 11:27:08.642926 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:27:21 crc kubenswrapper[4772]: I0320 11:27:21.641809 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:27:21 crc kubenswrapper[4772]: I0320 11:27:21.865205 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"012704aade0ad5a7f65fd738d00c4c74639b01c5992da616119c8257b71293ae"} Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.152286 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566768-pzg2n"] Mar 20 11:28:00 crc kubenswrapper[4772]: E0320 11:28:00.153522 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5d9984e5-88ba-4310-9ff4-6dfd2e5c4533" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.153553 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5d9984e5-88ba-4310-9ff4-6dfd2e5c4533" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.153937 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5d9984e5-88ba-4310-9ff4-6dfd2e5c4533" containerName="oc" Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.154827 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-pzg2n" Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.157603 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.157749 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.157763 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.159518 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-pzg2n"] Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.351552 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v2cq\" (UniqueName: \"kubernetes.io/projected/b88b4d3f-8e53-48d4-a044-5f4d87650794-kube-api-access-6v2cq\") pod \"auto-csr-approver-29566768-pzg2n\" (UID: \"b88b4d3f-8e53-48d4-a044-5f4d87650794\") " pod="openshift-infra/auto-csr-approver-29566768-pzg2n" Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.453454 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6v2cq\" (UniqueName: \"kubernetes.io/projected/b88b4d3f-8e53-48d4-a044-5f4d87650794-kube-api-access-6v2cq\") pod \"auto-csr-approver-29566768-pzg2n\" (UID: \"b88b4d3f-8e53-48d4-a044-5f4d87650794\") " pod="openshift-infra/auto-csr-approver-29566768-pzg2n" Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.475041 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6v2cq\" (UniqueName: \"kubernetes.io/projected/b88b4d3f-8e53-48d4-a044-5f4d87650794-kube-api-access-6v2cq\") pod \"auto-csr-approver-29566768-pzg2n\" (UID: \"b88b4d3f-8e53-48d4-a044-5f4d87650794\") " pod="openshift-infra/auto-csr-approver-29566768-pzg2n" Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.478719 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-pzg2n" Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.912364 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-pzg2n"] Mar 20 11:28:00 crc kubenswrapper[4772]: I0320 11:28:00.923613 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:28:01 crc kubenswrapper[4772]: I0320 11:28:01.151029 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-pzg2n" event={"ID":"b88b4d3f-8e53-48d4-a044-5f4d87650794","Type":"ContainerStarted","Data":"4cd4cf2df8e6bdc70c1e1b3cfc74d0751121b91c2f40cfbb7a1dbe6551336dda"} Mar 20 11:28:03 crc kubenswrapper[4772]: I0320 11:28:03.167177 4772 generic.go:334] "Generic (PLEG): container finished" podID="b88b4d3f-8e53-48d4-a044-5f4d87650794" containerID="92234d9f503d0f2c5a33766a44c27144b02a161a260fa7110e1f095c0299300e" exitCode=0 Mar 20 11:28:03 crc kubenswrapper[4772]: I0320 11:28:03.167237 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-pzg2n" event={"ID":"b88b4d3f-8e53-48d4-a044-5f4d87650794","Type":"ContainerDied","Data":"92234d9f503d0f2c5a33766a44c27144b02a161a260fa7110e1f095c0299300e"} Mar 20 11:28:04 crc kubenswrapper[4772]: I0320 11:28:04.551023 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-pzg2n" Mar 20 11:28:04 crc kubenswrapper[4772]: I0320 11:28:04.711287 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6v2cq\" (UniqueName: \"kubernetes.io/projected/b88b4d3f-8e53-48d4-a044-5f4d87650794-kube-api-access-6v2cq\") pod \"b88b4d3f-8e53-48d4-a044-5f4d87650794\" (UID: \"b88b4d3f-8e53-48d4-a044-5f4d87650794\") " Mar 20 11:28:04 crc kubenswrapper[4772]: I0320 11:28:04.716605 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b88b4d3f-8e53-48d4-a044-5f4d87650794-kube-api-access-6v2cq" (OuterVolumeSpecName: "kube-api-access-6v2cq") pod "b88b4d3f-8e53-48d4-a044-5f4d87650794" (UID: "b88b4d3f-8e53-48d4-a044-5f4d87650794"). InnerVolumeSpecName "kube-api-access-6v2cq". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:28:04 crc kubenswrapper[4772]: I0320 11:28:04.813132 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6v2cq\" (UniqueName: \"kubernetes.io/projected/b88b4d3f-8e53-48d4-a044-5f4d87650794-kube-api-access-6v2cq\") on node \"crc\" DevicePath \"\"" Mar 20 11:28:05 crc kubenswrapper[4772]: I0320 11:28:05.188802 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566768-pzg2n" event={"ID":"b88b4d3f-8e53-48d4-a044-5f4d87650794","Type":"ContainerDied","Data":"4cd4cf2df8e6bdc70c1e1b3cfc74d0751121b91c2f40cfbb7a1dbe6551336dda"} Mar 20 11:28:05 crc kubenswrapper[4772]: I0320 11:28:05.188858 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4cd4cf2df8e6bdc70c1e1b3cfc74d0751121b91c2f40cfbb7a1dbe6551336dda" Mar 20 11:28:05 crc kubenswrapper[4772]: I0320 11:28:05.188919 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566768-pzg2n" Mar 20 11:28:05 crc kubenswrapper[4772]: I0320 11:28:05.636671 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-bfkz4"] Mar 20 11:28:05 crc kubenswrapper[4772]: I0320 11:28:05.642922 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566762-bfkz4"] Mar 20 11:28:06 crc kubenswrapper[4772]: I0320 11:28:06.650420 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="668f5b39-b45e-41fc-83a6-a92c6dc32a41" path="/var/lib/kubelet/pods/668f5b39-b45e-41fc-83a6-a92c6dc32a41/volumes" Mar 20 11:28:39 crc kubenswrapper[4772]: I0320 11:28:39.055683 4772 scope.go:117] "RemoveContainer" containerID="976ae9b2d100148b381cf4740993ca3c945137365661f3547174241398837376" Mar 20 11:28:39 crc kubenswrapper[4772]: I0320 11:28:39.346623 4772 scope.go:117] "RemoveContainer" containerID="8a65f788c5b7ad4a3a6c254a3374b2dada45c0569a3fc1fda08351fa057574bd" Mar 20 11:28:39 crc kubenswrapper[4772]: I0320 11:28:39.371509 4772 scope.go:117] "RemoveContainer" containerID="586cb1897a21024b28250ac448d83cc3387ff8c7e49cfd47106088974f01a196" Mar 20 11:28:39 crc kubenswrapper[4772]: I0320 11:28:39.407731 4772 scope.go:117] "RemoveContainer" containerID="967c83de47271ed89c5d0e7a5fdaee1e3e2e2504e85bb06d01a8733d216539bf" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.046178 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-djt76"] Mar 20 11:29:30 crc kubenswrapper[4772]: E0320 11:29:30.047736 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b88b4d3f-8e53-48d4-a044-5f4d87650794" containerName="oc" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.047762 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b88b4d3f-8e53-48d4-a044-5f4d87650794" containerName="oc" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.048552 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b88b4d3f-8e53-48d4-a044-5f4d87650794" containerName="oc" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.050225 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.064865 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-djt76"] Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.201006 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwszf\" (UniqueName: \"kubernetes.io/projected/7061ad52-2a14-483d-bb1e-385ac5842a5e-kube-api-access-rwszf\") pod \"redhat-operators-djt76\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.201072 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-utilities\") pod \"redhat-operators-djt76\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.201112 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-catalog-content\") pod \"redhat-operators-djt76\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.302285 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwszf\" (UniqueName: \"kubernetes.io/projected/7061ad52-2a14-483d-bb1e-385ac5842a5e-kube-api-access-rwszf\") pod \"redhat-operators-djt76\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.302342 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-utilities\") pod \"redhat-operators-djt76\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.302369 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-catalog-content\") pod \"redhat-operators-djt76\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.302819 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-catalog-content\") pod \"redhat-operators-djt76\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.303008 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-utilities\") pod \"redhat-operators-djt76\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.324752 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-rwszf\" (UniqueName: \"kubernetes.io/projected/7061ad52-2a14-483d-bb1e-385ac5842a5e-kube-api-access-rwszf\") pod \"redhat-operators-djt76\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.376922 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.630160 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-djt76"] Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.857637 4772 generic.go:334] "Generic (PLEG): container finished" podID="7061ad52-2a14-483d-bb1e-385ac5842a5e" containerID="d0076a2026b40b6d08de417e90d0beec7f7bf7cd06fb0836fa2529b8dac45483" exitCode=0 Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.857808 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djt76" event={"ID":"7061ad52-2a14-483d-bb1e-385ac5842a5e","Type":"ContainerDied","Data":"d0076a2026b40b6d08de417e90d0beec7f7bf7cd06fb0836fa2529b8dac45483"} Mar 20 11:29:30 crc kubenswrapper[4772]: I0320 11:29:30.858024 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djt76" event={"ID":"7061ad52-2a14-483d-bb1e-385ac5842a5e","Type":"ContainerStarted","Data":"5cd88ae47b553a98a7c5023653150be05cf4713ae51a31ab4137884bfef90d79"} Mar 20 11:29:31 crc kubenswrapper[4772]: I0320 11:29:31.866048 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djt76" event={"ID":"7061ad52-2a14-483d-bb1e-385ac5842a5e","Type":"ContainerStarted","Data":"2b5b1dae180487bc79f779d7b4591f362b0f26968b77476fbeca3aff675905c0"} Mar 20 11:29:32 crc kubenswrapper[4772]: I0320 11:29:32.879412 4772 generic.go:334] "Generic (PLEG): container finished" podID="7061ad52-2a14-483d-bb1e-385ac5842a5e" containerID="2b5b1dae180487bc79f779d7b4591f362b0f26968b77476fbeca3aff675905c0" exitCode=0 Mar 20 11:29:32 crc kubenswrapper[4772]: I0320 11:29:32.879468 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djt76" event={"ID":"7061ad52-2a14-483d-bb1e-385ac5842a5e","Type":"ContainerDied","Data":"2b5b1dae180487bc79f779d7b4591f362b0f26968b77476fbeca3aff675905c0"} Mar 20 11:29:33 crc kubenswrapper[4772]: I0320 11:29:33.889233 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djt76" event={"ID":"7061ad52-2a14-483d-bb1e-385ac5842a5e","Type":"ContainerStarted","Data":"2206d15cda7afe4db59eed5f9d87235b3f3e7df3772fbd444d450f75bc28b6f4"} Mar 20 11:29:33 crc kubenswrapper[4772]: I0320 11:29:33.908321 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-djt76" podStartSLOduration=1.419713089 podStartE2EDuration="3.90830082s" podCreationTimestamp="2026-03-20 11:29:30 +0000 UTC" firstStartedPulling="2026-03-20 11:29:30.859179449 +0000 UTC m=+2056.950145924" lastFinishedPulling="2026-03-20 11:29:33.34776717 +0000 UTC m=+2059.438733655" observedRunningTime="2026-03-20 11:29:33.904006672 +0000 UTC m=+2059.994973157" watchObservedRunningTime="2026-03-20 11:29:33.90830082 +0000 UTC m=+2059.999267305" Mar 20 11:29:39 crc kubenswrapper[4772]: I0320 11:29:39.878311 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon 
namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:29:39 crc kubenswrapper[4772]: I0320 11:29:39.878587 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:29:40 crc kubenswrapper[4772]: I0320 11:29:40.377259 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:40 crc kubenswrapper[4772]: I0320 11:29:40.377546 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:40 crc kubenswrapper[4772]: I0320 11:29:40.417734 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:40 crc kubenswrapper[4772]: I0320 11:29:40.987331 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:41 crc kubenswrapper[4772]: I0320 11:29:41.038685 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-djt76"] Mar 20 11:29:42 crc kubenswrapper[4772]: I0320 11:29:42.956328 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-djt76" podUID="7061ad52-2a14-483d-bb1e-385ac5842a5e" containerName="registry-server" containerID="cri-o://2206d15cda7afe4db59eed5f9d87235b3f3e7df3772fbd444d450f75bc28b6f4" gracePeriod=2 Mar 20 11:29:43 crc kubenswrapper[4772]: I0320 11:29:43.968279 4772 generic.go:334] "Generic (PLEG): container finished" podID="7061ad52-2a14-483d-bb1e-385ac5842a5e" containerID="2206d15cda7afe4db59eed5f9d87235b3f3e7df3772fbd444d450f75bc28b6f4" exitCode=0 Mar 20 11:29:43 crc kubenswrapper[4772]: I0320 11:29:43.968324 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djt76" event={"ID":"7061ad52-2a14-483d-bb1e-385ac5842a5e","Type":"ContainerDied","Data":"2206d15cda7afe4db59eed5f9d87235b3f3e7df3772fbd444d450f75bc28b6f4"} Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.108458 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.199992 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-utilities\") pod \"7061ad52-2a14-483d-bb1e-385ac5842a5e\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.200036 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwszf\" (UniqueName: \"kubernetes.io/projected/7061ad52-2a14-483d-bb1e-385ac5842a5e-kube-api-access-rwszf\") pod \"7061ad52-2a14-483d-bb1e-385ac5842a5e\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.200185 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-catalog-content\") pod \"7061ad52-2a14-483d-bb1e-385ac5842a5e\" (UID: \"7061ad52-2a14-483d-bb1e-385ac5842a5e\") " Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.200831 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-utilities" (OuterVolumeSpecName: "utilities") pod "7061ad52-2a14-483d-bb1e-385ac5842a5e" (UID: "7061ad52-2a14-483d-bb1e-385ac5842a5e"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.205609 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7061ad52-2a14-483d-bb1e-385ac5842a5e-kube-api-access-rwszf" (OuterVolumeSpecName: "kube-api-access-rwszf") pod "7061ad52-2a14-483d-bb1e-385ac5842a5e" (UID: "7061ad52-2a14-483d-bb1e-385ac5842a5e"). InnerVolumeSpecName "kube-api-access-rwszf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.301764 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.301803 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwszf\" (UniqueName: \"kubernetes.io/projected/7061ad52-2a14-483d-bb1e-385ac5842a5e-kube-api-access-rwszf\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.344731 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7061ad52-2a14-483d-bb1e-385ac5842a5e" (UID: "7061ad52-2a14-483d-bb1e-385ac5842a5e"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.403223 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7061ad52-2a14-483d-bb1e-385ac5842a5e-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.976940 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-djt76" event={"ID":"7061ad52-2a14-483d-bb1e-385ac5842a5e","Type":"ContainerDied","Data":"5cd88ae47b553a98a7c5023653150be05cf4713ae51a31ab4137884bfef90d79"} Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.976983 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-djt76" Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.977305 4772 scope.go:117] "RemoveContainer" containerID="2206d15cda7afe4db59eed5f9d87235b3f3e7df3772fbd444d450f75bc28b6f4" Mar 20 11:29:44 crc kubenswrapper[4772]: I0320 11:29:44.999061 4772 scope.go:117] "RemoveContainer" containerID="2b5b1dae180487bc79f779d7b4591f362b0f26968b77476fbeca3aff675905c0" Mar 20 11:29:45 crc kubenswrapper[4772]: I0320 11:29:45.003117 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-djt76"] Mar 20 11:29:45 crc kubenswrapper[4772]: I0320 11:29:45.010411 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-djt76"] Mar 20 11:29:45 crc kubenswrapper[4772]: I0320 11:29:45.019821 4772 scope.go:117] "RemoveContainer" containerID="d0076a2026b40b6d08de417e90d0beec7f7bf7cd06fb0836fa2529b8dac45483" Mar 20 11:29:46 crc kubenswrapper[4772]: I0320 11:29:46.664084 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7061ad52-2a14-483d-bb1e-385ac5842a5e" path="/var/lib/kubelet/pods/7061ad52-2a14-483d-bb1e-385ac5842a5e/volumes" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.167952 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566770-gxx2d"] Mar 20 11:30:00 crc kubenswrapper[4772]: E0320 11:30:00.168894 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7061ad52-2a14-483d-bb1e-385ac5842a5e" containerName="extract-utilities" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.168913 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7061ad52-2a14-483d-bb1e-385ac5842a5e" containerName="extract-utilities" Mar 20 11:30:00 crc kubenswrapper[4772]: E0320 11:30:00.168942 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7061ad52-2a14-483d-bb1e-385ac5842a5e" containerName="registry-server" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.168951 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7061ad52-2a14-483d-bb1e-385ac5842a5e" containerName="registry-server" Mar 20 11:30:00 crc kubenswrapper[4772]: E0320 11:30:00.168977 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7061ad52-2a14-483d-bb1e-385ac5842a5e" containerName="extract-content" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.168986 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="7061ad52-2a14-483d-bb1e-385ac5842a5e" containerName="extract-content" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.169149 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="7061ad52-2a14-483d-bb1e-385ac5842a5e" containerName="registry-server" Mar 20 11:30:00 
crc kubenswrapper[4772]: I0320 11:30:00.169876 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-gxx2d" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.171478 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.172329 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.172330 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.175501 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf"] Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.176659 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.177980 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.181029 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-gxx2d"] Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.181501 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.186595 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf"] Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.224584 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4l4vx\" (UniqueName: \"kubernetes.io/projected/8077500d-1fed-4875-b8ba-eef3c71b5516-kube-api-access-4l4vx\") pod \"auto-csr-approver-29566770-gxx2d\" (UID: \"8077500d-1fed-4875-b8ba-eef3c71b5516\") " pod="openshift-infra/auto-csr-approver-29566770-gxx2d" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.224634 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hpbgh\" (UniqueName: \"kubernetes.io/projected/e8b81ad0-f227-453a-9837-c67d46a4c9c3-kube-api-access-hpbgh\") pod \"collect-profiles-29566770-fdjcf\" (UID: \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.224655 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8b81ad0-f227-453a-9837-c67d46a4c9c3-secret-volume\") pod \"collect-profiles-29566770-fdjcf\" (UID: \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.224683 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8b81ad0-f227-453a-9837-c67d46a4c9c3-config-volume\") pod \"collect-profiles-29566770-fdjcf\" (UID: 
\"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.327651 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4l4vx\" (UniqueName: \"kubernetes.io/projected/8077500d-1fed-4875-b8ba-eef3c71b5516-kube-api-access-4l4vx\") pod \"auto-csr-approver-29566770-gxx2d\" (UID: \"8077500d-1fed-4875-b8ba-eef3c71b5516\") " pod="openshift-infra/auto-csr-approver-29566770-gxx2d" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.327819 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hpbgh\" (UniqueName: \"kubernetes.io/projected/e8b81ad0-f227-453a-9837-c67d46a4c9c3-kube-api-access-hpbgh\") pod \"collect-profiles-29566770-fdjcf\" (UID: \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.327921 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8b81ad0-f227-453a-9837-c67d46a4c9c3-secret-volume\") pod \"collect-profiles-29566770-fdjcf\" (UID: \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.328009 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8b81ad0-f227-453a-9837-c67d46a4c9c3-config-volume\") pod \"collect-profiles-29566770-fdjcf\" (UID: \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.332150 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8b81ad0-f227-453a-9837-c67d46a4c9c3-config-volume\") pod \"collect-profiles-29566770-fdjcf\" (UID: \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.338827 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8b81ad0-f227-453a-9837-c67d46a4c9c3-secret-volume\") pod \"collect-profiles-29566770-fdjcf\" (UID: \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.349725 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4l4vx\" (UniqueName: \"kubernetes.io/projected/8077500d-1fed-4875-b8ba-eef3c71b5516-kube-api-access-4l4vx\") pod \"auto-csr-approver-29566770-gxx2d\" (UID: \"8077500d-1fed-4875-b8ba-eef3c71b5516\") " pod="openshift-infra/auto-csr-approver-29566770-gxx2d" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.356599 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hpbgh\" (UniqueName: \"kubernetes.io/projected/e8b81ad0-f227-453a-9837-c67d46a4c9c3-kube-api-access-hpbgh\") pod \"collect-profiles-29566770-fdjcf\" (UID: \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.490416 4772 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-gxx2d" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.506770 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.913704 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-gxx2d"] Mar 20 11:30:00 crc kubenswrapper[4772]: I0320 11:30:00.981581 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf"] Mar 20 11:30:01 crc kubenswrapper[4772]: I0320 11:30:01.094068 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" event={"ID":"e8b81ad0-f227-453a-9837-c67d46a4c9c3","Type":"ContainerStarted","Data":"bdca99e27131d162e17ccea9ff59b2ed441f1c2ca4048bd8fc3eb3a073884e00"} Mar 20 11:30:01 crc kubenswrapper[4772]: I0320 11:30:01.095275 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-gxx2d" event={"ID":"8077500d-1fed-4875-b8ba-eef3c71b5516","Type":"ContainerStarted","Data":"12c3dc2999ab9f25c38d43ff083388b3201ef98f35bce239e791d534f0cb7123"} Mar 20 11:30:02 crc kubenswrapper[4772]: I0320 11:30:02.102113 4772 generic.go:334] "Generic (PLEG): container finished" podID="e8b81ad0-f227-453a-9837-c67d46a4c9c3" containerID="cd57aba6084fd9c0b6db4c8dd8774afca9933ac6c02b0523456f88b888ca556f" exitCode=0 Mar 20 11:30:02 crc kubenswrapper[4772]: I0320 11:30:02.102225 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" event={"ID":"e8b81ad0-f227-453a-9837-c67d46a4c9c3","Type":"ContainerDied","Data":"cd57aba6084fd9c0b6db4c8dd8774afca9933ac6c02b0523456f88b888ca556f"} Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.113205 4772 generic.go:334] "Generic (PLEG): container finished" podID="8077500d-1fed-4875-b8ba-eef3c71b5516" containerID="cc33e3c5f46b6c4f651ee45545f44cf72280e989e57c75640edf629b3c4a2625" exitCode=0 Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.113293 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-gxx2d" event={"ID":"8077500d-1fed-4875-b8ba-eef3c71b5516","Type":"ContainerDied","Data":"cc33e3c5f46b6c4f651ee45545f44cf72280e989e57c75640edf629b3c4a2625"} Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.372751 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.478886 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8b81ad0-f227-453a-9837-c67d46a4c9c3-config-volume\") pod \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\" (UID: \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.478983 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hpbgh\" (UniqueName: \"kubernetes.io/projected/e8b81ad0-f227-453a-9837-c67d46a4c9c3-kube-api-access-hpbgh\") pod \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\" (UID: \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.479089 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8b81ad0-f227-453a-9837-c67d46a4c9c3-secret-volume\") pod \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\" (UID: \"e8b81ad0-f227-453a-9837-c67d46a4c9c3\") " Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.481092 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e8b81ad0-f227-453a-9837-c67d46a4c9c3-config-volume" (OuterVolumeSpecName: "config-volume") pod "e8b81ad0-f227-453a-9837-c67d46a4c9c3" (UID: "e8b81ad0-f227-453a-9837-c67d46a4c9c3"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.494199 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8b81ad0-f227-453a-9837-c67d46a4c9c3-kube-api-access-hpbgh" (OuterVolumeSpecName: "kube-api-access-hpbgh") pod "e8b81ad0-f227-453a-9837-c67d46a4c9c3" (UID: "e8b81ad0-f227-453a-9837-c67d46a4c9c3"). InnerVolumeSpecName "kube-api-access-hpbgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.494366 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8b81ad0-f227-453a-9837-c67d46a4c9c3-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "e8b81ad0-f227-453a-9837-c67d46a4c9c3" (UID: "e8b81ad0-f227-453a-9837-c67d46a4c9c3"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.581795 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e8b81ad0-f227-453a-9837-c67d46a4c9c3-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.581907 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hpbgh\" (UniqueName: \"kubernetes.io/projected/e8b81ad0-f227-453a-9837-c67d46a4c9c3-kube-api-access-hpbgh\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:03 crc kubenswrapper[4772]: I0320 11:30:03.581929 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/e8b81ad0-f227-453a-9837-c67d46a4c9c3-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:04 crc kubenswrapper[4772]: I0320 11:30:04.125687 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" Mar 20 11:30:04 crc kubenswrapper[4772]: I0320 11:30:04.125719 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566770-fdjcf" event={"ID":"e8b81ad0-f227-453a-9837-c67d46a4c9c3","Type":"ContainerDied","Data":"bdca99e27131d162e17ccea9ff59b2ed441f1c2ca4048bd8fc3eb3a073884e00"} Mar 20 11:30:04 crc kubenswrapper[4772]: I0320 11:30:04.125783 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdca99e27131d162e17ccea9ff59b2ed441f1c2ca4048bd8fc3eb3a073884e00" Mar 20 11:30:04 crc kubenswrapper[4772]: I0320 11:30:04.385697 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-gxx2d" Mar 20 11:30:04 crc kubenswrapper[4772]: I0320 11:30:04.446702 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg"] Mar 20 11:30:04 crc kubenswrapper[4772]: I0320 11:30:04.455464 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566725-gdkpg"] Mar 20 11:30:04 crc kubenswrapper[4772]: I0320 11:30:04.498231 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4l4vx\" (UniqueName: \"kubernetes.io/projected/8077500d-1fed-4875-b8ba-eef3c71b5516-kube-api-access-4l4vx\") pod \"8077500d-1fed-4875-b8ba-eef3c71b5516\" (UID: \"8077500d-1fed-4875-b8ba-eef3c71b5516\") " Mar 20 11:30:04 crc kubenswrapper[4772]: I0320 11:30:04.503357 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8077500d-1fed-4875-b8ba-eef3c71b5516-kube-api-access-4l4vx" (OuterVolumeSpecName: "kube-api-access-4l4vx") pod "8077500d-1fed-4875-b8ba-eef3c71b5516" (UID: "8077500d-1fed-4875-b8ba-eef3c71b5516"). InnerVolumeSpecName "kube-api-access-4l4vx". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:30:04 crc kubenswrapper[4772]: I0320 11:30:04.600024 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4l4vx\" (UniqueName: \"kubernetes.io/projected/8077500d-1fed-4875-b8ba-eef3c71b5516-kube-api-access-4l4vx\") on node \"crc\" DevicePath \"\"" Mar 20 11:30:04 crc kubenswrapper[4772]: I0320 11:30:04.656189 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06381439-6997-45aa-8dce-62b012b0ac68" path="/var/lib/kubelet/pods/06381439-6997-45aa-8dce-62b012b0ac68/volumes" Mar 20 11:30:05 crc kubenswrapper[4772]: I0320 11:30:05.134336 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566770-gxx2d" event={"ID":"8077500d-1fed-4875-b8ba-eef3c71b5516","Type":"ContainerDied","Data":"12c3dc2999ab9f25c38d43ff083388b3201ef98f35bce239e791d534f0cb7123"} Mar 20 11:30:05 crc kubenswrapper[4772]: I0320 11:30:05.134415 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="12c3dc2999ab9f25c38d43ff083388b3201ef98f35bce239e791d534f0cb7123" Mar 20 11:30:05 crc kubenswrapper[4772]: I0320 11:30:05.134516 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566770-gxx2d" Mar 20 11:30:05 crc kubenswrapper[4772]: I0320 11:30:05.457506 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-p5c9w"] Mar 20 11:30:05 crc kubenswrapper[4772]: I0320 11:30:05.462053 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566764-p5c9w"] Mar 20 11:30:06 crc kubenswrapper[4772]: I0320 11:30:06.653290 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82c4e609-de65-4673-ba94-624e450f8548" path="/var/lib/kubelet/pods/82c4e609-de65-4673-ba94-624e450f8548/volumes" Mar 20 11:30:09 crc kubenswrapper[4772]: I0320 11:30:09.565340 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:30:09 crc kubenswrapper[4772]: I0320 11:30:09.565484 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:30:39 crc kubenswrapper[4772]: I0320 11:30:39.564707 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:30:39 crc kubenswrapper[4772]: I0320 11:30:39.565485 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:30:39 crc kubenswrapper[4772]: I0320 11:30:39.565556 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 11:30:39 crc kubenswrapper[4772]: I0320 11:30:39.566712 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"012704aade0ad5a7f65fd738d00c4c74639b01c5992da616119c8257b71293ae"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:30:39 crc kubenswrapper[4772]: I0320 11:30:39.566780 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://012704aade0ad5a7f65fd738d00c4c74639b01c5992da616119c8257b71293ae" gracePeriod=600 Mar 20 11:30:39 crc kubenswrapper[4772]: I0320 11:30:39.896179 4772 scope.go:117] "RemoveContainer" containerID="e53c63cafd33fbfa4e94f437ac55a29be8c86aee585aa730bb52cab188476104" Mar 20 11:30:39 crc kubenswrapper[4772]: I0320 11:30:39.915318 4772 scope.go:117] "RemoveContainer" 
containerID="e5085d30c4ba0c11e78692eae42d2540d7cbfe7678bf9ac1c35465a1a5e27bf1" Mar 20 11:30:40 crc kubenswrapper[4772]: I0320 11:30:40.410800 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="012704aade0ad5a7f65fd738d00c4c74639b01c5992da616119c8257b71293ae" exitCode=0 Mar 20 11:30:40 crc kubenswrapper[4772]: I0320 11:30:40.410891 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"012704aade0ad5a7f65fd738d00c4c74639b01c5992da616119c8257b71293ae"} Mar 20 11:30:40 crc kubenswrapper[4772]: I0320 11:30:40.411301 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af"} Mar 20 11:30:40 crc kubenswrapper[4772]: I0320 11:30:40.411340 4772 scope.go:117] "RemoveContainer" containerID="9adc03e4c8cae8c190b4aff684c2a53d0d0df8bf4d7169686be544be92b3e69a" Mar 20 11:31:30 crc kubenswrapper[4772]: I0320 11:31:30.810996 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-mnzts"] Mar 20 11:31:30 crc kubenswrapper[4772]: E0320 11:31:30.812040 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8b81ad0-f227-453a-9837-c67d46a4c9c3" containerName="collect-profiles" Mar 20 11:31:30 crc kubenswrapper[4772]: I0320 11:31:30.812058 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8b81ad0-f227-453a-9837-c67d46a4c9c3" containerName="collect-profiles" Mar 20 11:31:30 crc kubenswrapper[4772]: E0320 11:31:30.812078 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8077500d-1fed-4875-b8ba-eef3c71b5516" containerName="oc" Mar 20 11:31:30 crc kubenswrapper[4772]: I0320 11:31:30.812091 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="8077500d-1fed-4875-b8ba-eef3c71b5516" containerName="oc" Mar 20 11:31:30 crc kubenswrapper[4772]: I0320 11:31:30.812313 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="8077500d-1fed-4875-b8ba-eef3c71b5516" containerName="oc" Mar 20 11:31:30 crc kubenswrapper[4772]: I0320 11:31:30.812346 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8b81ad0-f227-453a-9837-c67d46a4c9c3" containerName="collect-profiles" Mar 20 11:31:30 crc kubenswrapper[4772]: I0320 11:31:30.813890 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:30 crc kubenswrapper[4772]: I0320 11:31:30.843426 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mnzts"] Mar 20 11:31:30 crc kubenswrapper[4772]: I0320 11:31:30.913441 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ggct\" (UniqueName: \"kubernetes.io/projected/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-kube-api-access-8ggct\") pod \"certified-operators-mnzts\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:30 crc kubenswrapper[4772]: I0320 11:31:30.913565 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-utilities\") pod \"certified-operators-mnzts\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:30 crc kubenswrapper[4772]: I0320 11:31:30.913606 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-catalog-content\") pod \"certified-operators-mnzts\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:31 crc kubenswrapper[4772]: I0320 11:31:31.015334 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ggct\" (UniqueName: \"kubernetes.io/projected/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-kube-api-access-8ggct\") pod \"certified-operators-mnzts\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:31 crc kubenswrapper[4772]: I0320 11:31:31.015400 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-utilities\") pod \"certified-operators-mnzts\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:31 crc kubenswrapper[4772]: I0320 11:31:31.015426 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-catalog-content\") pod \"certified-operators-mnzts\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:31 crc kubenswrapper[4772]: I0320 11:31:31.015995 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-catalog-content\") pod \"certified-operators-mnzts\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:31 crc kubenswrapper[4772]: I0320 11:31:31.016157 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-utilities\") pod \"certified-operators-mnzts\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:31 crc kubenswrapper[4772]: I0320 11:31:31.039220 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-8ggct\" (UniqueName: \"kubernetes.io/projected/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-kube-api-access-8ggct\") pod \"certified-operators-mnzts\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:31 crc kubenswrapper[4772]: I0320 11:31:31.134076 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:31 crc kubenswrapper[4772]: I0320 11:31:31.683541 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-mnzts"] Mar 20 11:31:31 crc kubenswrapper[4772]: I0320 11:31:31.838142 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnzts" event={"ID":"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5","Type":"ContainerStarted","Data":"a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e"} Mar 20 11:31:31 crc kubenswrapper[4772]: I0320 11:31:31.838193 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnzts" event={"ID":"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5","Type":"ContainerStarted","Data":"0b5e7113b670f1da0108092a98d8f32ed878db87145272b9e043f009bb4a6b4c"} Mar 20 11:31:32 crc kubenswrapper[4772]: I0320 11:31:32.845909 4772 generic.go:334] "Generic (PLEG): container finished" podID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" containerID="a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e" exitCode=0 Mar 20 11:31:32 crc kubenswrapper[4772]: I0320 11:31:32.845968 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnzts" event={"ID":"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5","Type":"ContainerDied","Data":"a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e"} Mar 20 11:31:34 crc kubenswrapper[4772]: I0320 11:31:34.861565 4772 generic.go:334] "Generic (PLEG): container finished" podID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" containerID="c69afdd843ad2ec1a22915d9820bc38d6588c35e037ac63a58fb067bedb4fe4f" exitCode=0 Mar 20 11:31:34 crc kubenswrapper[4772]: I0320 11:31:34.861673 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnzts" event={"ID":"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5","Type":"ContainerDied","Data":"c69afdd843ad2ec1a22915d9820bc38d6588c35e037ac63a58fb067bedb4fe4f"} Mar 20 11:31:35 crc kubenswrapper[4772]: I0320 11:31:35.870345 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnzts" event={"ID":"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5","Type":"ContainerStarted","Data":"722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21"} Mar 20 11:31:35 crc kubenswrapper[4772]: I0320 11:31:35.895480 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-mnzts" podStartSLOduration=3.332611331 podStartE2EDuration="5.895461744s" podCreationTimestamp="2026-03-20 11:31:30 +0000 UTC" firstStartedPulling="2026-03-20 11:31:32.848022044 +0000 UTC m=+2178.938988519" lastFinishedPulling="2026-03-20 11:31:35.410872457 +0000 UTC m=+2181.501838932" observedRunningTime="2026-03-20 11:31:35.891408374 +0000 UTC m=+2181.982374859" watchObservedRunningTime="2026-03-20 11:31:35.895461744 +0000 UTC m=+2181.986428229" Mar 20 11:31:41 crc kubenswrapper[4772]: I0320 11:31:41.135253 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:41 crc kubenswrapper[4772]: I0320 11:31:41.135663 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:41 crc kubenswrapper[4772]: I0320 11:31:41.176575 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:41 crc kubenswrapper[4772]: I0320 11:31:41.956232 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:42 crc kubenswrapper[4772]: I0320 11:31:42.016217 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mnzts"] Mar 20 11:31:43 crc kubenswrapper[4772]: I0320 11:31:43.922160 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-mnzts" podUID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" containerName="registry-server" containerID="cri-o://722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21" gracePeriod=2 Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.303147 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.396609 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-utilities\") pod \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.396668 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-catalog-content\") pod \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.396716 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ggct\" (UniqueName: \"kubernetes.io/projected/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-kube-api-access-8ggct\") pod \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\" (UID: \"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5\") " Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.397546 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-utilities" (OuterVolumeSpecName: "utilities") pod "a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" (UID: "a9597ede-5b5d-4a2b-9286-f94f4f76f6f5"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.402100 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-kube-api-access-8ggct" (OuterVolumeSpecName: "kube-api-access-8ggct") pod "a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" (UID: "a9597ede-5b5d-4a2b-9286-f94f4f76f6f5"). InnerVolumeSpecName "kube-api-access-8ggct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.498529 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ggct\" (UniqueName: \"kubernetes.io/projected/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-kube-api-access-8ggct\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.498564 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.931262 4772 generic.go:334] "Generic (PLEG): container finished" podID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" containerID="722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21" exitCode=0 Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.931310 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnzts" event={"ID":"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5","Type":"ContainerDied","Data":"722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21"} Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.931342 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-mnzts" event={"ID":"a9597ede-5b5d-4a2b-9286-f94f4f76f6f5","Type":"ContainerDied","Data":"0b5e7113b670f1da0108092a98d8f32ed878db87145272b9e043f009bb4a6b4c"} Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.931345 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-mnzts" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.931361 4772 scope.go:117] "RemoveContainer" containerID="722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.954564 4772 scope.go:117] "RemoveContainer" containerID="c69afdd843ad2ec1a22915d9820bc38d6588c35e037ac63a58fb067bedb4fe4f" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.971166 4772 scope.go:117] "RemoveContainer" containerID="a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.998176 4772 scope.go:117] "RemoveContainer" containerID="722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21" Mar 20 11:31:44 crc kubenswrapper[4772]: E0320 11:31:44.999084 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21\": container with ID starting with 722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21 not found: ID does not exist" containerID="722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.999122 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21"} err="failed to get container status \"722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21\": rpc error: code = NotFound desc = could not find container \"722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21\": container with ID starting with 722e109e386f77fff43e896f92a55bb755eb45fa4f5b8afafb879a82e8184d21 not found: ID does not exist" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.999148 4772 scope.go:117] 
"RemoveContainer" containerID="c69afdd843ad2ec1a22915d9820bc38d6588c35e037ac63a58fb067bedb4fe4f" Mar 20 11:31:44 crc kubenswrapper[4772]: E0320 11:31:44.999725 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c69afdd843ad2ec1a22915d9820bc38d6588c35e037ac63a58fb067bedb4fe4f\": container with ID starting with c69afdd843ad2ec1a22915d9820bc38d6588c35e037ac63a58fb067bedb4fe4f not found: ID does not exist" containerID="c69afdd843ad2ec1a22915d9820bc38d6588c35e037ac63a58fb067bedb4fe4f" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.999748 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c69afdd843ad2ec1a22915d9820bc38d6588c35e037ac63a58fb067bedb4fe4f"} err="failed to get container status \"c69afdd843ad2ec1a22915d9820bc38d6588c35e037ac63a58fb067bedb4fe4f\": rpc error: code = NotFound desc = could not find container \"c69afdd843ad2ec1a22915d9820bc38d6588c35e037ac63a58fb067bedb4fe4f\": container with ID starting with c69afdd843ad2ec1a22915d9820bc38d6588c35e037ac63a58fb067bedb4fe4f not found: ID does not exist" Mar 20 11:31:44 crc kubenswrapper[4772]: I0320 11:31:44.999762 4772 scope.go:117] "RemoveContainer" containerID="a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e" Mar 20 11:31:45 crc kubenswrapper[4772]: E0320 11:31:45.000152 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e\": container with ID starting with a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e not found: ID does not exist" containerID="a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e" Mar 20 11:31:45 crc kubenswrapper[4772]: I0320 11:31:45.000244 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e"} err="failed to get container status \"a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e\": rpc error: code = NotFound desc = could not find container \"a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e\": container with ID starting with a10e96cb92bad7bea0b168ddc0168e538ae513dead90e95e10a2d8d0e18b595e not found: ID does not exist" Mar 20 11:31:45 crc kubenswrapper[4772]: I0320 11:31:45.464620 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" (UID: "a9597ede-5b5d-4a2b-9286-f94f4f76f6f5"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:31:45 crc kubenswrapper[4772]: I0320 11:31:45.512976 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:31:45 crc kubenswrapper[4772]: I0320 11:31:45.560582 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-mnzts"] Mar 20 11:31:45 crc kubenswrapper[4772]: I0320 11:31:45.565481 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-mnzts"] Mar 20 11:31:46 crc kubenswrapper[4772]: I0320 11:31:46.651482 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" path="/var/lib/kubelet/pods/a9597ede-5b5d-4a2b-9286-f94f4f76f6f5/volumes" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.152830 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566772-f6gdx"] Mar 20 11:32:00 crc kubenswrapper[4772]: E0320 11:32:00.153874 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.153925 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4772]: E0320 11:32:00.153959 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" containerName="extract-utilities" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.153973 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" containerName="extract-utilities" Mar 20 11:32:00 crc kubenswrapper[4772]: E0320 11:32:00.153998 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" containerName="extract-content" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.154009 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" containerName="extract-content" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.154196 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="a9597ede-5b5d-4a2b-9286-f94f4f76f6f5" containerName="registry-server" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.154777 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-f6gdx" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.158098 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.158325 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.158794 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.162606 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-f6gdx"] Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.219466 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwdf6\" (UniqueName: \"kubernetes.io/projected/03743166-d880-4e37-affe-51018c1e8a7a-kube-api-access-pwdf6\") pod \"auto-csr-approver-29566772-f6gdx\" (UID: \"03743166-d880-4e37-affe-51018c1e8a7a\") " pod="openshift-infra/auto-csr-approver-29566772-f6gdx" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.321771 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwdf6\" (UniqueName: \"kubernetes.io/projected/03743166-d880-4e37-affe-51018c1e8a7a-kube-api-access-pwdf6\") pod \"auto-csr-approver-29566772-f6gdx\" (UID: \"03743166-d880-4e37-affe-51018c1e8a7a\") " pod="openshift-infra/auto-csr-approver-29566772-f6gdx" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.345702 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwdf6\" (UniqueName: \"kubernetes.io/projected/03743166-d880-4e37-affe-51018c1e8a7a-kube-api-access-pwdf6\") pod \"auto-csr-approver-29566772-f6gdx\" (UID: \"03743166-d880-4e37-affe-51018c1e8a7a\") " pod="openshift-infra/auto-csr-approver-29566772-f6gdx" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.477098 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-f6gdx" Mar 20 11:32:00 crc kubenswrapper[4772]: I0320 11:32:00.902428 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-f6gdx"] Mar 20 11:32:01 crc kubenswrapper[4772]: I0320 11:32:01.047186 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-f6gdx" event={"ID":"03743166-d880-4e37-affe-51018c1e8a7a","Type":"ContainerStarted","Data":"95ce2470a1a186524f9a085dd8fe30eeab8846cea8a683a5852b8fde1b0892ff"} Mar 20 11:32:03 crc kubenswrapper[4772]: I0320 11:32:03.061854 4772 generic.go:334] "Generic (PLEG): container finished" podID="03743166-d880-4e37-affe-51018c1e8a7a" containerID="05b4cb3e37865ce1dd37be6be1ab89d4aa658ed0a605f1b034eb639d25fe53ae" exitCode=0 Mar 20 11:32:03 crc kubenswrapper[4772]: I0320 11:32:03.061939 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-f6gdx" event={"ID":"03743166-d880-4e37-affe-51018c1e8a7a","Type":"ContainerDied","Data":"05b4cb3e37865ce1dd37be6be1ab89d4aa658ed0a605f1b034eb639d25fe53ae"} Mar 20 11:32:04 crc kubenswrapper[4772]: I0320 11:32:04.396308 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-f6gdx" Mar 20 11:32:04 crc kubenswrapper[4772]: I0320 11:32:04.490068 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pwdf6\" (UniqueName: \"kubernetes.io/projected/03743166-d880-4e37-affe-51018c1e8a7a-kube-api-access-pwdf6\") pod \"03743166-d880-4e37-affe-51018c1e8a7a\" (UID: \"03743166-d880-4e37-affe-51018c1e8a7a\") " Mar 20 11:32:04 crc kubenswrapper[4772]: I0320 11:32:04.497233 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/03743166-d880-4e37-affe-51018c1e8a7a-kube-api-access-pwdf6" (OuterVolumeSpecName: "kube-api-access-pwdf6") pod "03743166-d880-4e37-affe-51018c1e8a7a" (UID: "03743166-d880-4e37-affe-51018c1e8a7a"). InnerVolumeSpecName "kube-api-access-pwdf6". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:32:04 crc kubenswrapper[4772]: I0320 11:32:04.591922 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pwdf6\" (UniqueName: \"kubernetes.io/projected/03743166-d880-4e37-affe-51018c1e8a7a-kube-api-access-pwdf6\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:05 crc kubenswrapper[4772]: I0320 11:32:05.075820 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566772-f6gdx" event={"ID":"03743166-d880-4e37-affe-51018c1e8a7a","Type":"ContainerDied","Data":"95ce2470a1a186524f9a085dd8fe30eeab8846cea8a683a5852b8fde1b0892ff"} Mar 20 11:32:05 crc kubenswrapper[4772]: I0320 11:32:05.075875 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="95ce2470a1a186524f9a085dd8fe30eeab8846cea8a683a5852b8fde1b0892ff" Mar 20 11:32:05 crc kubenswrapper[4772]: I0320 11:32:05.075961 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566772-f6gdx" Mar 20 11:32:05 crc kubenswrapper[4772]: I0320 11:32:05.466749 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-lh5zc"] Mar 20 11:32:05 crc kubenswrapper[4772]: I0320 11:32:05.472005 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566766-lh5zc"] Mar 20 11:32:06 crc kubenswrapper[4772]: I0320 11:32:06.655161 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5d9984e5-88ba-4310-9ff4-6dfd2e5c4533" path="/var/lib/kubelet/pods/5d9984e5-88ba-4310-9ff4-6dfd2e5c4533/volumes" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.452240 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-xlg82"] Mar 20 11:32:17 crc kubenswrapper[4772]: E0320 11:32:17.453150 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="03743166-d880-4e37-affe-51018c1e8a7a" containerName="oc" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.453166 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="03743166-d880-4e37-affe-51018c1e8a7a" containerName="oc" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.453322 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="03743166-d880-4e37-affe-51018c1e8a7a" containerName="oc" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.454545 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.469263 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xlg82"] Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.505921 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-utilities\") pod \"community-operators-xlg82\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.506043 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-catalog-content\") pod \"community-operators-xlg82\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.506088 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mmvtc\" (UniqueName: \"kubernetes.io/projected/e376a0fe-8508-42e9-8b1f-f741a4c25817-kube-api-access-mmvtc\") pod \"community-operators-xlg82\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.606877 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-catalog-content\") pod \"community-operators-xlg82\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.606919 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mmvtc\" (UniqueName: \"kubernetes.io/projected/e376a0fe-8508-42e9-8b1f-f741a4c25817-kube-api-access-mmvtc\") pod \"community-operators-xlg82\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.606983 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-utilities\") pod \"community-operators-xlg82\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.607391 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-catalog-content\") pod \"community-operators-xlg82\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.607408 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-utilities\") pod \"community-operators-xlg82\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.625729 4772 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-mmvtc\" (UniqueName: \"kubernetes.io/projected/e376a0fe-8508-42e9-8b1f-f741a4c25817-kube-api-access-mmvtc\") pod \"community-operators-xlg82\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:17 crc kubenswrapper[4772]: I0320 11:32:17.776452 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:18 crc kubenswrapper[4772]: I0320 11:32:18.269120 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-xlg82"] Mar 20 11:32:19 crc kubenswrapper[4772]: I0320 11:32:19.180883 4772 generic.go:334] "Generic (PLEG): container finished" podID="e376a0fe-8508-42e9-8b1f-f741a4c25817" containerID="1a4107cfba1674fba9cf3e62f5da503b476eebcc1fb04b3a2c93844354b74c4a" exitCode=0 Mar 20 11:32:19 crc kubenswrapper[4772]: I0320 11:32:19.180930 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlg82" event={"ID":"e376a0fe-8508-42e9-8b1f-f741a4c25817","Type":"ContainerDied","Data":"1a4107cfba1674fba9cf3e62f5da503b476eebcc1fb04b3a2c93844354b74c4a"} Mar 20 11:32:19 crc kubenswrapper[4772]: I0320 11:32:19.181150 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlg82" event={"ID":"e376a0fe-8508-42e9-8b1f-f741a4c25817","Type":"ContainerStarted","Data":"9f09bd0fb821033adca51dfd45681a18faba25cce32c0a462b6a493b7642ffaa"} Mar 20 11:32:21 crc kubenswrapper[4772]: I0320 11:32:21.216812 4772 generic.go:334] "Generic (PLEG): container finished" podID="e376a0fe-8508-42e9-8b1f-f741a4c25817" containerID="4fca25b43eb6a207c289834202e4f7e463c3530a2a7d3c8b8d1b8595a8d77c6f" exitCode=0 Mar 20 11:32:21 crc kubenswrapper[4772]: I0320 11:32:21.217196 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlg82" event={"ID":"e376a0fe-8508-42e9-8b1f-f741a4c25817","Type":"ContainerDied","Data":"4fca25b43eb6a207c289834202e4f7e463c3530a2a7d3c8b8d1b8595a8d77c6f"} Mar 20 11:32:22 crc kubenswrapper[4772]: I0320 11:32:22.227757 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlg82" event={"ID":"e376a0fe-8508-42e9-8b1f-f741a4c25817","Type":"ContainerStarted","Data":"dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b"} Mar 20 11:32:22 crc kubenswrapper[4772]: I0320 11:32:22.249652 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-xlg82" podStartSLOduration=2.83493672 podStartE2EDuration="5.24962874s" podCreationTimestamp="2026-03-20 11:32:17 +0000 UTC" firstStartedPulling="2026-03-20 11:32:19.182897599 +0000 UTC m=+2225.273864084" lastFinishedPulling="2026-03-20 11:32:21.597589609 +0000 UTC m=+2227.688556104" observedRunningTime="2026-03-20 11:32:22.246607988 +0000 UTC m=+2228.337574513" watchObservedRunningTime="2026-03-20 11:32:22.24962874 +0000 UTC m=+2228.340595245" Mar 20 11:32:27 crc kubenswrapper[4772]: I0320 11:32:27.785062 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:27 crc kubenswrapper[4772]: I0320 11:32:27.785424 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:27 crc kubenswrapper[4772]: I0320 11:32:27.826643 4772 
kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:28 crc kubenswrapper[4772]: I0320 11:32:28.343048 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:28 crc kubenswrapper[4772]: I0320 11:32:28.398579 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xlg82"] Mar 20 11:32:30 crc kubenswrapper[4772]: I0320 11:32:30.290204 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-xlg82" podUID="e376a0fe-8508-42e9-8b1f-f741a4c25817" containerName="registry-server" containerID="cri-o://dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b" gracePeriod=2 Mar 20 11:32:30 crc kubenswrapper[4772]: I0320 11:32:30.783128 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:30 crc kubenswrapper[4772]: I0320 11:32:30.813808 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-catalog-content\") pod \"e376a0fe-8508-42e9-8b1f-f741a4c25817\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " Mar 20 11:32:30 crc kubenswrapper[4772]: I0320 11:32:30.813888 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-utilities\") pod \"e376a0fe-8508-42e9-8b1f-f741a4c25817\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " Mar 20 11:32:30 crc kubenswrapper[4772]: I0320 11:32:30.813976 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mmvtc\" (UniqueName: \"kubernetes.io/projected/e376a0fe-8508-42e9-8b1f-f741a4c25817-kube-api-access-mmvtc\") pod \"e376a0fe-8508-42e9-8b1f-f741a4c25817\" (UID: \"e376a0fe-8508-42e9-8b1f-f741a4c25817\") " Mar 20 11:32:30 crc kubenswrapper[4772]: I0320 11:32:30.843042 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-utilities" (OuterVolumeSpecName: "utilities") pod "e376a0fe-8508-42e9-8b1f-f741a4c25817" (UID: "e376a0fe-8508-42e9-8b1f-f741a4c25817"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:32:30 crc kubenswrapper[4772]: I0320 11:32:30.849736 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e376a0fe-8508-42e9-8b1f-f741a4c25817-kube-api-access-mmvtc" (OuterVolumeSpecName: "kube-api-access-mmvtc") pod "e376a0fe-8508-42e9-8b1f-f741a4c25817" (UID: "e376a0fe-8508-42e9-8b1f-f741a4c25817"). InnerVolumeSpecName "kube-api-access-mmvtc". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:32:30 crc kubenswrapper[4772]: I0320 11:32:30.902551 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e376a0fe-8508-42e9-8b1f-f741a4c25817" (UID: "e376a0fe-8508-42e9-8b1f-f741a4c25817"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:32:30 crc kubenswrapper[4772]: I0320 11:32:30.915901 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:30 crc kubenswrapper[4772]: I0320 11:32:30.915948 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mmvtc\" (UniqueName: \"kubernetes.io/projected/e376a0fe-8508-42e9-8b1f-f741a4c25817-kube-api-access-mmvtc\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:30 crc kubenswrapper[4772]: I0320 11:32:30.915960 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e376a0fe-8508-42e9-8b1f-f741a4c25817-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.299464 4772 generic.go:334] "Generic (PLEG): container finished" podID="e376a0fe-8508-42e9-8b1f-f741a4c25817" containerID="dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b" exitCode=0 Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.299519 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-xlg82" Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.299534 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlg82" event={"ID":"e376a0fe-8508-42e9-8b1f-f741a4c25817","Type":"ContainerDied","Data":"dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b"} Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.299848 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-xlg82" event={"ID":"e376a0fe-8508-42e9-8b1f-f741a4c25817","Type":"ContainerDied","Data":"9f09bd0fb821033adca51dfd45681a18faba25cce32c0a462b6a493b7642ffaa"} Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.299877 4772 scope.go:117] "RemoveContainer" containerID="dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b" Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.321090 4772 scope.go:117] "RemoveContainer" containerID="4fca25b43eb6a207c289834202e4f7e463c3530a2a7d3c8b8d1b8595a8d77c6f" Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.334422 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-xlg82"] Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.338907 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-xlg82"] Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.367765 4772 scope.go:117] "RemoveContainer" containerID="1a4107cfba1674fba9cf3e62f5da503b476eebcc1fb04b3a2c93844354b74c4a" Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.386000 4772 scope.go:117] "RemoveContainer" containerID="dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b" Mar 20 11:32:31 crc kubenswrapper[4772]: E0320 11:32:31.386534 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b\": container with ID starting with dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b not found: ID does not exist" containerID="dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b" Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.386583 
4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b"} err="failed to get container status \"dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b\": rpc error: code = NotFound desc = could not find container \"dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b\": container with ID starting with dd319d26f9f2c911037b8b6d7ad611244d568483697483c2ad657e9316027d9b not found: ID does not exist" Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.386613 4772 scope.go:117] "RemoveContainer" containerID="4fca25b43eb6a207c289834202e4f7e463c3530a2a7d3c8b8d1b8595a8d77c6f" Mar 20 11:32:31 crc kubenswrapper[4772]: E0320 11:32:31.387040 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4fca25b43eb6a207c289834202e4f7e463c3530a2a7d3c8b8d1b8595a8d77c6f\": container with ID starting with 4fca25b43eb6a207c289834202e4f7e463c3530a2a7d3c8b8d1b8595a8d77c6f not found: ID does not exist" containerID="4fca25b43eb6a207c289834202e4f7e463c3530a2a7d3c8b8d1b8595a8d77c6f" Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.387132 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4fca25b43eb6a207c289834202e4f7e463c3530a2a7d3c8b8d1b8595a8d77c6f"} err="failed to get container status \"4fca25b43eb6a207c289834202e4f7e463c3530a2a7d3c8b8d1b8595a8d77c6f\": rpc error: code = NotFound desc = could not find container \"4fca25b43eb6a207c289834202e4f7e463c3530a2a7d3c8b8d1b8595a8d77c6f\": container with ID starting with 4fca25b43eb6a207c289834202e4f7e463c3530a2a7d3c8b8d1b8595a8d77c6f not found: ID does not exist" Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.387168 4772 scope.go:117] "RemoveContainer" containerID="1a4107cfba1674fba9cf3e62f5da503b476eebcc1fb04b3a2c93844354b74c4a" Mar 20 11:32:31 crc kubenswrapper[4772]: E0320 11:32:31.388016 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1a4107cfba1674fba9cf3e62f5da503b476eebcc1fb04b3a2c93844354b74c4a\": container with ID starting with 1a4107cfba1674fba9cf3e62f5da503b476eebcc1fb04b3a2c93844354b74c4a not found: ID does not exist" containerID="1a4107cfba1674fba9cf3e62f5da503b476eebcc1fb04b3a2c93844354b74c4a" Mar 20 11:32:31 crc kubenswrapper[4772]: I0320 11:32:31.388048 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1a4107cfba1674fba9cf3e62f5da503b476eebcc1fb04b3a2c93844354b74c4a"} err="failed to get container status \"1a4107cfba1674fba9cf3e62f5da503b476eebcc1fb04b3a2c93844354b74c4a\": rpc error: code = NotFound desc = could not find container \"1a4107cfba1674fba9cf3e62f5da503b476eebcc1fb04b3a2c93844354b74c4a\": container with ID starting with 1a4107cfba1674fba9cf3e62f5da503b476eebcc1fb04b3a2c93844354b74c4a not found: ID does not exist" Mar 20 11:32:32 crc kubenswrapper[4772]: I0320 11:32:32.659524 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e376a0fe-8508-42e9-8b1f-f741a4c25817" path="/var/lib/kubelet/pods/e376a0fe-8508-42e9-8b1f-f741a4c25817/volumes" Mar 20 11:32:39 crc kubenswrapper[4772]: I0320 11:32:39.564108 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 
127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:32:39 crc kubenswrapper[4772]: I0320 11:32:39.564895 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.027252 4772 scope.go:117] "RemoveContainer" containerID="1618b2250445d384bbeec852ae4b2085c7bea901effba1e8ef6cc1d84393a0f4" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.227787 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2mnvh"] Mar 20 11:32:40 crc kubenswrapper[4772]: E0320 11:32:40.228823 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e376a0fe-8508-42e9-8b1f-f741a4c25817" containerName="registry-server" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.228878 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e376a0fe-8508-42e9-8b1f-f741a4c25817" containerName="registry-server" Mar 20 11:32:40 crc kubenswrapper[4772]: E0320 11:32:40.228929 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e376a0fe-8508-42e9-8b1f-f741a4c25817" containerName="extract-utilities" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.228945 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e376a0fe-8508-42e9-8b1f-f741a4c25817" containerName="extract-utilities" Mar 20 11:32:40 crc kubenswrapper[4772]: E0320 11:32:40.228978 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e376a0fe-8508-42e9-8b1f-f741a4c25817" containerName="extract-content" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.228991 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e376a0fe-8508-42e9-8b1f-f741a4c25817" containerName="extract-content" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.229250 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e376a0fe-8508-42e9-8b1f-f741a4c25817" containerName="registry-server" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.231691 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.253031 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2mnvh"] Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.393263 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-catalog-content\") pod \"redhat-marketplace-2mnvh\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.393311 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-utilities\") pod \"redhat-marketplace-2mnvh\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.393345 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zbspt\" (UniqueName: \"kubernetes.io/projected/c30a24d6-519b-420b-9d81-e1056a903c20-kube-api-access-zbspt\") pod \"redhat-marketplace-2mnvh\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.494378 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-catalog-content\") pod \"redhat-marketplace-2mnvh\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.494431 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-utilities\") pod \"redhat-marketplace-2mnvh\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.494463 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zbspt\" (UniqueName: \"kubernetes.io/projected/c30a24d6-519b-420b-9d81-e1056a903c20-kube-api-access-zbspt\") pod \"redhat-marketplace-2mnvh\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.494872 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-utilities\") pod \"redhat-marketplace-2mnvh\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.495110 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-catalog-content\") pod \"redhat-marketplace-2mnvh\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.523898 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-zbspt\" (UniqueName: \"kubernetes.io/projected/c30a24d6-519b-420b-9d81-e1056a903c20-kube-api-access-zbspt\") pod \"redhat-marketplace-2mnvh\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:40 crc kubenswrapper[4772]: I0320 11:32:40.597742 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:41 crc kubenswrapper[4772]: I0320 11:32:41.067144 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2mnvh"] Mar 20 11:32:41 crc kubenswrapper[4772]: W0320 11:32:41.078255 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc30a24d6_519b_420b_9d81_e1056a903c20.slice/crio-481cab5dcff44a21d01029a2db0137d4bc2af0648808043a2db9d11b530e9795 WatchSource:0}: Error finding container 481cab5dcff44a21d01029a2db0137d4bc2af0648808043a2db9d11b530e9795: Status 404 returned error can't find the container with id 481cab5dcff44a21d01029a2db0137d4bc2af0648808043a2db9d11b530e9795 Mar 20 11:32:41 crc kubenswrapper[4772]: I0320 11:32:41.381582 4772 generic.go:334] "Generic (PLEG): container finished" podID="c30a24d6-519b-420b-9d81-e1056a903c20" containerID="2fd4504019f3999c0c3310b3160dfd9afcf6fd069917d2ffd56d95a78ece080a" exitCode=0 Mar 20 11:32:41 crc kubenswrapper[4772]: I0320 11:32:41.382027 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mnvh" event={"ID":"c30a24d6-519b-420b-9d81-e1056a903c20","Type":"ContainerDied","Data":"2fd4504019f3999c0c3310b3160dfd9afcf6fd069917d2ffd56d95a78ece080a"} Mar 20 11:32:41 crc kubenswrapper[4772]: I0320 11:32:41.382080 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mnvh" event={"ID":"c30a24d6-519b-420b-9d81-e1056a903c20","Type":"ContainerStarted","Data":"481cab5dcff44a21d01029a2db0137d4bc2af0648808043a2db9d11b530e9795"} Mar 20 11:32:43 crc kubenswrapper[4772]: I0320 11:32:43.404936 4772 generic.go:334] "Generic (PLEG): container finished" podID="c30a24d6-519b-420b-9d81-e1056a903c20" containerID="4065f67cf86246f34569ebfb2c3ffcc09391882ae3bd5fdc012b9e7af3a4bd9d" exitCode=0 Mar 20 11:32:43 crc kubenswrapper[4772]: I0320 11:32:43.405693 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mnvh" event={"ID":"c30a24d6-519b-420b-9d81-e1056a903c20","Type":"ContainerDied","Data":"4065f67cf86246f34569ebfb2c3ffcc09391882ae3bd5fdc012b9e7af3a4bd9d"} Mar 20 11:32:44 crc kubenswrapper[4772]: I0320 11:32:44.416913 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mnvh" event={"ID":"c30a24d6-519b-420b-9d81-e1056a903c20","Type":"ContainerStarted","Data":"798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a"} Mar 20 11:32:44 crc kubenswrapper[4772]: I0320 11:32:44.434918 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2mnvh" podStartSLOduration=1.918455351 podStartE2EDuration="4.434897709s" podCreationTimestamp="2026-03-20 11:32:40 +0000 UTC" firstStartedPulling="2026-03-20 11:32:41.38507075 +0000 UTC m=+2247.476037275" lastFinishedPulling="2026-03-20 11:32:43.901513138 +0000 UTC m=+2249.992479633" observedRunningTime="2026-03-20 11:32:44.431761974 +0000 UTC m=+2250.522728459" 
watchObservedRunningTime="2026-03-20 11:32:44.434897709 +0000 UTC m=+2250.525864194" Mar 20 11:32:50 crc kubenswrapper[4772]: I0320 11:32:50.598328 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:50 crc kubenswrapper[4772]: I0320 11:32:50.598928 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:50 crc kubenswrapper[4772]: I0320 11:32:50.660530 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:51 crc kubenswrapper[4772]: I0320 11:32:51.524117 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:52 crc kubenswrapper[4772]: I0320 11:32:52.455607 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2mnvh"] Mar 20 11:32:53 crc kubenswrapper[4772]: I0320 11:32:53.489759 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2mnvh" podUID="c30a24d6-519b-420b-9d81-e1056a903c20" containerName="registry-server" containerID="cri-o://798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a" gracePeriod=2 Mar 20 11:32:53 crc kubenswrapper[4772]: I0320 11:32:53.907184 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:53 crc kubenswrapper[4772]: I0320 11:32:53.994427 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-utilities\") pod \"c30a24d6-519b-420b-9d81-e1056a903c20\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " Mar 20 11:32:53 crc kubenswrapper[4772]: I0320 11:32:53.994535 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zbspt\" (UniqueName: \"kubernetes.io/projected/c30a24d6-519b-420b-9d81-e1056a903c20-kube-api-access-zbspt\") pod \"c30a24d6-519b-420b-9d81-e1056a903c20\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " Mar 20 11:32:53 crc kubenswrapper[4772]: I0320 11:32:53.994795 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-catalog-content\") pod \"c30a24d6-519b-420b-9d81-e1056a903c20\" (UID: \"c30a24d6-519b-420b-9d81-e1056a903c20\") " Mar 20 11:32:53 crc kubenswrapper[4772]: I0320 11:32:53.995457 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-utilities" (OuterVolumeSpecName: "utilities") pod "c30a24d6-519b-420b-9d81-e1056a903c20" (UID: "c30a24d6-519b-420b-9d81-e1056a903c20"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.004721 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c30a24d6-519b-420b-9d81-e1056a903c20-kube-api-access-zbspt" (OuterVolumeSpecName: "kube-api-access-zbspt") pod "c30a24d6-519b-420b-9d81-e1056a903c20" (UID: "c30a24d6-519b-420b-9d81-e1056a903c20"). InnerVolumeSpecName "kube-api-access-zbspt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.097930 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zbspt\" (UniqueName: \"kubernetes.io/projected/c30a24d6-519b-420b-9d81-e1056a903c20-kube-api-access-zbspt\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.098031 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.500954 4772 generic.go:334] "Generic (PLEG): container finished" podID="c30a24d6-519b-420b-9d81-e1056a903c20" containerID="798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a" exitCode=0 Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.501054 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mnvh" event={"ID":"c30a24d6-519b-420b-9d81-e1056a903c20","Type":"ContainerDied","Data":"798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a"} Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.501133 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2mnvh" event={"ID":"c30a24d6-519b-420b-9d81-e1056a903c20","Type":"ContainerDied","Data":"481cab5dcff44a21d01029a2db0137d4bc2af0648808043a2db9d11b530e9795"} Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.501178 4772 scope.go:117] "RemoveContainer" containerID="798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.501007 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2mnvh" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.529539 4772 scope.go:117] "RemoveContainer" containerID="4065f67cf86246f34569ebfb2c3ffcc09391882ae3bd5fdc012b9e7af3a4bd9d" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.552289 4772 scope.go:117] "RemoveContainer" containerID="2fd4504019f3999c0c3310b3160dfd9afcf6fd069917d2ffd56d95a78ece080a" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.602540 4772 scope.go:117] "RemoveContainer" containerID="798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a" Mar 20 11:32:54 crc kubenswrapper[4772]: E0320 11:32:54.603197 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a\": container with ID starting with 798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a not found: ID does not exist" containerID="798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.603395 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a"} err="failed to get container status \"798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a\": rpc error: code = NotFound desc = could not find container \"798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a\": container with ID starting with 798ae298bfa9fb90509de62632c8b5b54b499778f2fda905bbdd98c82e0a8d0a not found: ID does not exist" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.603446 4772 scope.go:117] "RemoveContainer" containerID="4065f67cf86246f34569ebfb2c3ffcc09391882ae3bd5fdc012b9e7af3a4bd9d" Mar 20 11:32:54 crc kubenswrapper[4772]: E0320 11:32:54.603938 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4065f67cf86246f34569ebfb2c3ffcc09391882ae3bd5fdc012b9e7af3a4bd9d\": container with ID starting with 4065f67cf86246f34569ebfb2c3ffcc09391882ae3bd5fdc012b9e7af3a4bd9d not found: ID does not exist" containerID="4065f67cf86246f34569ebfb2c3ffcc09391882ae3bd5fdc012b9e7af3a4bd9d" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.604012 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4065f67cf86246f34569ebfb2c3ffcc09391882ae3bd5fdc012b9e7af3a4bd9d"} err="failed to get container status \"4065f67cf86246f34569ebfb2c3ffcc09391882ae3bd5fdc012b9e7af3a4bd9d\": rpc error: code = NotFound desc = could not find container \"4065f67cf86246f34569ebfb2c3ffcc09391882ae3bd5fdc012b9e7af3a4bd9d\": container with ID starting with 4065f67cf86246f34569ebfb2c3ffcc09391882ae3bd5fdc012b9e7af3a4bd9d not found: ID does not exist" Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.604061 4772 scope.go:117] "RemoveContainer" containerID="2fd4504019f3999c0c3310b3160dfd9afcf6fd069917d2ffd56d95a78ece080a" Mar 20 11:32:54 crc kubenswrapper[4772]: E0320 11:32:54.604519 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fd4504019f3999c0c3310b3160dfd9afcf6fd069917d2ffd56d95a78ece080a\": container with ID starting with 2fd4504019f3999c0c3310b3160dfd9afcf6fd069917d2ffd56d95a78ece080a not found: ID does not exist" containerID="2fd4504019f3999c0c3310b3160dfd9afcf6fd069917d2ffd56d95a78ece080a" 
Mar 20 11:32:54 crc kubenswrapper[4772]: I0320 11:32:54.604546 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2fd4504019f3999c0c3310b3160dfd9afcf6fd069917d2ffd56d95a78ece080a"} err="failed to get container status \"2fd4504019f3999c0c3310b3160dfd9afcf6fd069917d2ffd56d95a78ece080a\": rpc error: code = NotFound desc = could not find container \"2fd4504019f3999c0c3310b3160dfd9afcf6fd069917d2ffd56d95a78ece080a\": container with ID starting with 2fd4504019f3999c0c3310b3160dfd9afcf6fd069917d2ffd56d95a78ece080a not found: ID does not exist" Mar 20 11:32:56 crc kubenswrapper[4772]: I0320 11:32:56.270272 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c30a24d6-519b-420b-9d81-e1056a903c20" (UID: "c30a24d6-519b-420b-9d81-e1056a903c20"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:32:56 crc kubenswrapper[4772]: I0320 11:32:56.338899 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c30a24d6-519b-420b-9d81-e1056a903c20-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:32:56 crc kubenswrapper[4772]: I0320 11:32:56.365985 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2mnvh"] Mar 20 11:32:56 crc kubenswrapper[4772]: I0320 11:32:56.375558 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2mnvh"] Mar 20 11:32:56 crc kubenswrapper[4772]: I0320 11:32:56.663206 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c30a24d6-519b-420b-9d81-e1056a903c20" path="/var/lib/kubelet/pods/c30a24d6-519b-420b-9d81-e1056a903c20/volumes" Mar 20 11:33:09 crc kubenswrapper[4772]: I0320 11:33:09.564708 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:33:09 crc kubenswrapper[4772]: I0320 11:33:09.565286 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:33:39 crc kubenswrapper[4772]: I0320 11:33:39.564598 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:33:39 crc kubenswrapper[4772]: I0320 11:33:39.565165 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:33:39 crc kubenswrapper[4772]: I0320 11:33:39.565214 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 11:33:39 crc kubenswrapper[4772]: I0320 11:33:39.565842 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:33:39 crc kubenswrapper[4772]: I0320 11:33:39.565922 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" gracePeriod=600 Mar 20 11:33:39 crc kubenswrapper[4772]: E0320 11:33:39.887425 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:33:40 crc kubenswrapper[4772]: I0320 11:33:40.846333 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" exitCode=0 Mar 20 11:33:40 crc kubenswrapper[4772]: I0320 11:33:40.846381 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af"} Mar 20 11:33:40 crc kubenswrapper[4772]: I0320 11:33:40.846411 4772 scope.go:117] "RemoveContainer" containerID="012704aade0ad5a7f65fd738d00c4c74639b01c5992da616119c8257b71293ae" Mar 20 11:33:40 crc kubenswrapper[4772]: I0320 11:33:40.846922 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:33:40 crc kubenswrapper[4772]: E0320 11:33:40.847129 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:33:54 crc kubenswrapper[4772]: I0320 11:33:54.648782 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:33:54 crc kubenswrapper[4772]: E0320 11:33:54.649657 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 
11:34:00.154724 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566774-z6xh6"] Mar 20 11:34:00 crc kubenswrapper[4772]: E0320 11:34:00.155442 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30a24d6-519b-420b-9d81-e1056a903c20" containerName="extract-utilities" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.155458 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30a24d6-519b-420b-9d81-e1056a903c20" containerName="extract-utilities" Mar 20 11:34:00 crc kubenswrapper[4772]: E0320 11:34:00.155492 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30a24d6-519b-420b-9d81-e1056a903c20" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.155500 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30a24d6-519b-420b-9d81-e1056a903c20" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4772]: E0320 11:34:00.155522 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c30a24d6-519b-420b-9d81-e1056a903c20" containerName="extract-content" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.155530 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="c30a24d6-519b-420b-9d81-e1056a903c20" containerName="extract-content" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.155720 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="c30a24d6-519b-420b-9d81-e1056a903c20" containerName="registry-server" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.156350 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-z6xh6" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.159651 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.159935 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-z6xh6"] Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.159983 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.160394 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.281777 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-84djb\" (UniqueName: \"kubernetes.io/projected/aec50b8d-6d2a-4721-8812-b878368cc151-kube-api-access-84djb\") pod \"auto-csr-approver-29566774-z6xh6\" (UID: \"aec50b8d-6d2a-4721-8812-b878368cc151\") " pod="openshift-infra/auto-csr-approver-29566774-z6xh6" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.382795 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-84djb\" (UniqueName: \"kubernetes.io/projected/aec50b8d-6d2a-4721-8812-b878368cc151-kube-api-access-84djb\") pod \"auto-csr-approver-29566774-z6xh6\" (UID: \"aec50b8d-6d2a-4721-8812-b878368cc151\") " pod="openshift-infra/auto-csr-approver-29566774-z6xh6" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.407558 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-84djb\" (UniqueName: \"kubernetes.io/projected/aec50b8d-6d2a-4721-8812-b878368cc151-kube-api-access-84djb\") pod 
\"auto-csr-approver-29566774-z6xh6\" (UID: \"aec50b8d-6d2a-4721-8812-b878368cc151\") " pod="openshift-infra/auto-csr-approver-29566774-z6xh6" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.482758 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-z6xh6" Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.922563 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-z6xh6"] Mar 20 11:34:00 crc kubenswrapper[4772]: I0320 11:34:00.937626 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:34:01 crc kubenswrapper[4772]: I0320 11:34:01.032333 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-z6xh6" event={"ID":"aec50b8d-6d2a-4721-8812-b878368cc151","Type":"ContainerStarted","Data":"e36358f8e5ee32e39bc0a7a02761db06f5562792d41c1634cf4aeb20993cdf30"} Mar 20 11:34:03 crc kubenswrapper[4772]: I0320 11:34:03.048090 4772 generic.go:334] "Generic (PLEG): container finished" podID="aec50b8d-6d2a-4721-8812-b878368cc151" containerID="76e0fa81fbd8b238cac7afc60648b6a1c3d3d0d7bb5d96d50d72efcb02e8fe79" exitCode=0 Mar 20 11:34:03 crc kubenswrapper[4772]: I0320 11:34:03.048168 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-z6xh6" event={"ID":"aec50b8d-6d2a-4721-8812-b878368cc151","Type":"ContainerDied","Data":"76e0fa81fbd8b238cac7afc60648b6a1c3d3d0d7bb5d96d50d72efcb02e8fe79"} Mar 20 11:34:04 crc kubenswrapper[4772]: I0320 11:34:04.298785 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-z6xh6" Mar 20 11:34:04 crc kubenswrapper[4772]: I0320 11:34:04.437467 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-84djb\" (UniqueName: \"kubernetes.io/projected/aec50b8d-6d2a-4721-8812-b878368cc151-kube-api-access-84djb\") pod \"aec50b8d-6d2a-4721-8812-b878368cc151\" (UID: \"aec50b8d-6d2a-4721-8812-b878368cc151\") " Mar 20 11:34:04 crc kubenswrapper[4772]: I0320 11:34:04.446509 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/aec50b8d-6d2a-4721-8812-b878368cc151-kube-api-access-84djb" (OuterVolumeSpecName: "kube-api-access-84djb") pod "aec50b8d-6d2a-4721-8812-b878368cc151" (UID: "aec50b8d-6d2a-4721-8812-b878368cc151"). InnerVolumeSpecName "kube-api-access-84djb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:34:04 crc kubenswrapper[4772]: I0320 11:34:04.538868 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-84djb\" (UniqueName: \"kubernetes.io/projected/aec50b8d-6d2a-4721-8812-b878368cc151-kube-api-access-84djb\") on node \"crc\" DevicePath \"\"" Mar 20 11:34:05 crc kubenswrapper[4772]: I0320 11:34:05.063203 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566774-z6xh6" event={"ID":"aec50b8d-6d2a-4721-8812-b878368cc151","Type":"ContainerDied","Data":"e36358f8e5ee32e39bc0a7a02761db06f5562792d41c1634cf4aeb20993cdf30"} Mar 20 11:34:05 crc kubenswrapper[4772]: I0320 11:34:05.063895 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="e36358f8e5ee32e39bc0a7a02761db06f5562792d41c1634cf4aeb20993cdf30" Mar 20 11:34:05 crc kubenswrapper[4772]: I0320 11:34:05.063315 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566774-z6xh6" Mar 20 11:34:05 crc kubenswrapper[4772]: I0320 11:34:05.364442 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-pzg2n"] Mar 20 11:34:05 crc kubenswrapper[4772]: I0320 11:34:05.370608 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566768-pzg2n"] Mar 20 11:34:06 crc kubenswrapper[4772]: I0320 11:34:06.641311 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:34:06 crc kubenswrapper[4772]: E0320 11:34:06.641523 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:34:06 crc kubenswrapper[4772]: I0320 11:34:06.652520 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b88b4d3f-8e53-48d4-a044-5f4d87650794" path="/var/lib/kubelet/pods/b88b4d3f-8e53-48d4-a044-5f4d87650794/volumes" Mar 20 11:34:21 crc kubenswrapper[4772]: I0320 11:34:21.641552 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:34:21 crc kubenswrapper[4772]: E0320 11:34:21.642323 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:34:32 crc kubenswrapper[4772]: I0320 11:34:32.643267 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:34:32 crc kubenswrapper[4772]: E0320 11:34:32.644464 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:34:40 crc kubenswrapper[4772]: I0320 11:34:40.159116 4772 scope.go:117] "RemoveContainer" containerID="92234d9f503d0f2c5a33766a44c27144b02a161a260fa7110e1f095c0299300e" Mar 20 11:34:46 crc kubenswrapper[4772]: I0320 11:34:46.641549 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:34:46 crc kubenswrapper[4772]: E0320 11:34:46.642252 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 
11:35:00 crc kubenswrapper[4772]: I0320 11:35:00.642025 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:35:00 crc kubenswrapper[4772]: E0320 11:35:00.643668 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:35:15 crc kubenswrapper[4772]: I0320 11:35:15.641791 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:35:15 crc kubenswrapper[4772]: E0320 11:35:15.642587 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:35:27 crc kubenswrapper[4772]: I0320 11:35:27.642675 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:35:27 crc kubenswrapper[4772]: E0320 11:35:27.643774 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:35:39 crc kubenswrapper[4772]: I0320 11:35:39.641819 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:35:39 crc kubenswrapper[4772]: E0320 11:35:39.642448 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:35:54 crc kubenswrapper[4772]: I0320 11:35:54.647263 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:35:54 crc kubenswrapper[4772]: E0320 11:35:54.648130 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.138621 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566776-mv89f"] Mar 20 11:36:00 crc 
kubenswrapper[4772]: E0320 11:36:00.140194 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="aec50b8d-6d2a-4721-8812-b878368cc151" containerName="oc" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.140278 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="aec50b8d-6d2a-4721-8812-b878368cc151" containerName="oc" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.140464 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="aec50b8d-6d2a-4721-8812-b878368cc151" containerName="oc" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.141221 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-mv89f" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.143127 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.143292 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d2rs5\" (UniqueName: \"kubernetes.io/projected/49979ceb-586d-484a-96cf-989cf2703da2-kube-api-access-d2rs5\") pod \"auto-csr-approver-29566776-mv89f\" (UID: \"49979ceb-586d-484a-96cf-989cf2703da2\") " pod="openshift-infra/auto-csr-approver-29566776-mv89f" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.143385 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.143551 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.160580 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-mv89f"] Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.244659 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d2rs5\" (UniqueName: \"kubernetes.io/projected/49979ceb-586d-484a-96cf-989cf2703da2-kube-api-access-d2rs5\") pod \"auto-csr-approver-29566776-mv89f\" (UID: \"49979ceb-586d-484a-96cf-989cf2703da2\") " pod="openshift-infra/auto-csr-approver-29566776-mv89f" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.263889 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d2rs5\" (UniqueName: \"kubernetes.io/projected/49979ceb-586d-484a-96cf-989cf2703da2-kube-api-access-d2rs5\") pod \"auto-csr-approver-29566776-mv89f\" (UID: \"49979ceb-586d-484a-96cf-989cf2703da2\") " pod="openshift-infra/auto-csr-approver-29566776-mv89f" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.461805 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-mv89f" Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.894887 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-mv89f"] Mar 20 11:36:00 crc kubenswrapper[4772]: I0320 11:36:00.911436 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-mv89f" event={"ID":"49979ceb-586d-484a-96cf-989cf2703da2","Type":"ContainerStarted","Data":"063e0d4542d0d919f7725380d8fe54caca9c42d067f6e93490a228b2125ea278"} Mar 20 11:36:02 crc kubenswrapper[4772]: I0320 11:36:02.926474 4772 generic.go:334] "Generic (PLEG): container finished" podID="49979ceb-586d-484a-96cf-989cf2703da2" containerID="97a2d83cf1b1364d878a5ecc4d9ce90bf7e072df8da68221e2a60268f8f594b7" exitCode=0 Mar 20 11:36:02 crc kubenswrapper[4772]: I0320 11:36:02.926921 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-mv89f" event={"ID":"49979ceb-586d-484a-96cf-989cf2703da2","Type":"ContainerDied","Data":"97a2d83cf1b1364d878a5ecc4d9ce90bf7e072df8da68221e2a60268f8f594b7"} Mar 20 11:36:04 crc kubenswrapper[4772]: I0320 11:36:04.185265 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-mv89f" Mar 20 11:36:04 crc kubenswrapper[4772]: I0320 11:36:04.305010 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d2rs5\" (UniqueName: \"kubernetes.io/projected/49979ceb-586d-484a-96cf-989cf2703da2-kube-api-access-d2rs5\") pod \"49979ceb-586d-484a-96cf-989cf2703da2\" (UID: \"49979ceb-586d-484a-96cf-989cf2703da2\") " Mar 20 11:36:04 crc kubenswrapper[4772]: I0320 11:36:04.310307 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49979ceb-586d-484a-96cf-989cf2703da2-kube-api-access-d2rs5" (OuterVolumeSpecName: "kube-api-access-d2rs5") pod "49979ceb-586d-484a-96cf-989cf2703da2" (UID: "49979ceb-586d-484a-96cf-989cf2703da2"). InnerVolumeSpecName "kube-api-access-d2rs5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:36:04 crc kubenswrapper[4772]: I0320 11:36:04.406907 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d2rs5\" (UniqueName: \"kubernetes.io/projected/49979ceb-586d-484a-96cf-989cf2703da2-kube-api-access-d2rs5\") on node \"crc\" DevicePath \"\"" Mar 20 11:36:04 crc kubenswrapper[4772]: I0320 11:36:04.942630 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566776-mv89f" event={"ID":"49979ceb-586d-484a-96cf-989cf2703da2","Type":"ContainerDied","Data":"063e0d4542d0d919f7725380d8fe54caca9c42d067f6e93490a228b2125ea278"} Mar 20 11:36:04 crc kubenswrapper[4772]: I0320 11:36:04.943013 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="063e0d4542d0d919f7725380d8fe54caca9c42d067f6e93490a228b2125ea278" Mar 20 11:36:04 crc kubenswrapper[4772]: I0320 11:36:04.942675 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566776-mv89f" Mar 20 11:36:05 crc kubenswrapper[4772]: I0320 11:36:05.256579 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-gxx2d"] Mar 20 11:36:05 crc kubenswrapper[4772]: I0320 11:36:05.266718 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566770-gxx2d"] Mar 20 11:36:06 crc kubenswrapper[4772]: I0320 11:36:06.650499 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8077500d-1fed-4875-b8ba-eef3c71b5516" path="/var/lib/kubelet/pods/8077500d-1fed-4875-b8ba-eef3c71b5516/volumes" Mar 20 11:36:07 crc kubenswrapper[4772]: I0320 11:36:07.641966 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:36:07 crc kubenswrapper[4772]: E0320 11:36:07.642236 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:36:20 crc kubenswrapper[4772]: I0320 11:36:20.641780 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:36:20 crc kubenswrapper[4772]: E0320 11:36:20.642628 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:36:33 crc kubenswrapper[4772]: I0320 11:36:33.641809 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:36:33 crc kubenswrapper[4772]: E0320 11:36:33.642636 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:36:40 crc kubenswrapper[4772]: I0320 11:36:40.240368 4772 scope.go:117] "RemoveContainer" containerID="cc33e3c5f46b6c4f651ee45545f44cf72280e989e57c75640edf629b3c4a2625" Mar 20 11:36:45 crc kubenswrapper[4772]: I0320 11:36:45.642424 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:36:45 crc kubenswrapper[4772]: E0320 11:36:45.643690 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 
11:36:58 crc kubenswrapper[4772]: I0320 11:36:58.641501 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:36:58 crc kubenswrapper[4772]: E0320 11:36:58.643321 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:37:11 crc kubenswrapper[4772]: I0320 11:37:11.642357 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:37:11 crc kubenswrapper[4772]: E0320 11:37:11.643136 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:37:23 crc kubenswrapper[4772]: I0320 11:37:23.642728 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:37:23 crc kubenswrapper[4772]: E0320 11:37:23.645268 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:37:35 crc kubenswrapper[4772]: I0320 11:37:35.642264 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:37:35 crc kubenswrapper[4772]: E0320 11:37:35.643114 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:37:49 crc kubenswrapper[4772]: I0320 11:37:49.642129 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:37:49 crc kubenswrapper[4772]: E0320 11:37:49.643174 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.144809 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566778-pzbwx"] Mar 20 11:38:00 crc 
kubenswrapper[4772]: E0320 11:38:00.146304 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="49979ceb-586d-484a-96cf-989cf2703da2" containerName="oc" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.146326 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="49979ceb-586d-484a-96cf-989cf2703da2" containerName="oc" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.146633 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="49979ceb-586d-484a-96cf-989cf2703da2" containerName="oc" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.147506 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-pzbwx" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.151598 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.151676 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.151771 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-pzbwx"] Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.151954 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.241748 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rsc7p\" (UniqueName: \"kubernetes.io/projected/047fb828-0158-4d01-b9eb-531e3399a2ff-kube-api-access-rsc7p\") pod \"auto-csr-approver-29566778-pzbwx\" (UID: \"047fb828-0158-4d01-b9eb-531e3399a2ff\") " pod="openshift-infra/auto-csr-approver-29566778-pzbwx" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.343909 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rsc7p\" (UniqueName: \"kubernetes.io/projected/047fb828-0158-4d01-b9eb-531e3399a2ff-kube-api-access-rsc7p\") pod \"auto-csr-approver-29566778-pzbwx\" (UID: \"047fb828-0158-4d01-b9eb-531e3399a2ff\") " pod="openshift-infra/auto-csr-approver-29566778-pzbwx" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.368946 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rsc7p\" (UniqueName: \"kubernetes.io/projected/047fb828-0158-4d01-b9eb-531e3399a2ff-kube-api-access-rsc7p\") pod \"auto-csr-approver-29566778-pzbwx\" (UID: \"047fb828-0158-4d01-b9eb-531e3399a2ff\") " pod="openshift-infra/auto-csr-approver-29566778-pzbwx" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.468648 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-pzbwx" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.644168 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:38:00 crc kubenswrapper[4772]: E0320 11:38:00.644992 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:38:00 crc kubenswrapper[4772]: I0320 11:38:00.995688 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-pzbwx"] Mar 20 11:38:01 crc kubenswrapper[4772]: I0320 11:38:01.844800 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-pzbwx" event={"ID":"047fb828-0158-4d01-b9eb-531e3399a2ff","Type":"ContainerStarted","Data":"ba93de86419f1eb3d18d27700a8b992e3d13b8f8d1ca3b6e896ca4f45431f5ba"} Mar 20 11:38:02 crc kubenswrapper[4772]: I0320 11:38:02.854138 4772 generic.go:334] "Generic (PLEG): container finished" podID="047fb828-0158-4d01-b9eb-531e3399a2ff" containerID="faf58613ba231ffd21859cedac83c6ec9658264f7b5daa5020df45c2b1191b66" exitCode=0 Mar 20 11:38:02 crc kubenswrapper[4772]: I0320 11:38:02.854251 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-pzbwx" event={"ID":"047fb828-0158-4d01-b9eb-531e3399a2ff","Type":"ContainerDied","Data":"faf58613ba231ffd21859cedac83c6ec9658264f7b5daa5020df45c2b1191b66"} Mar 20 11:38:04 crc kubenswrapper[4772]: I0320 11:38:04.111487 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-pzbwx" Mar 20 11:38:04 crc kubenswrapper[4772]: I0320 11:38:04.215429 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rsc7p\" (UniqueName: \"kubernetes.io/projected/047fb828-0158-4d01-b9eb-531e3399a2ff-kube-api-access-rsc7p\") pod \"047fb828-0158-4d01-b9eb-531e3399a2ff\" (UID: \"047fb828-0158-4d01-b9eb-531e3399a2ff\") " Mar 20 11:38:04 crc kubenswrapper[4772]: I0320 11:38:04.220667 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/047fb828-0158-4d01-b9eb-531e3399a2ff-kube-api-access-rsc7p" (OuterVolumeSpecName: "kube-api-access-rsc7p") pod "047fb828-0158-4d01-b9eb-531e3399a2ff" (UID: "047fb828-0158-4d01-b9eb-531e3399a2ff"). InnerVolumeSpecName "kube-api-access-rsc7p". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:38:04 crc kubenswrapper[4772]: I0320 11:38:04.317087 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rsc7p\" (UniqueName: \"kubernetes.io/projected/047fb828-0158-4d01-b9eb-531e3399a2ff-kube-api-access-rsc7p\") on node \"crc\" DevicePath \"\"" Mar 20 11:38:04 crc kubenswrapper[4772]: I0320 11:38:04.869632 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566778-pzbwx" event={"ID":"047fb828-0158-4d01-b9eb-531e3399a2ff","Type":"ContainerDied","Data":"ba93de86419f1eb3d18d27700a8b992e3d13b8f8d1ca3b6e896ca4f45431f5ba"} Mar 20 11:38:04 crc kubenswrapper[4772]: I0320 11:38:04.869671 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ba93de86419f1eb3d18d27700a8b992e3d13b8f8d1ca3b6e896ca4f45431f5ba" Mar 20 11:38:04 crc kubenswrapper[4772]: I0320 11:38:04.869707 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566778-pzbwx" Mar 20 11:38:05 crc kubenswrapper[4772]: I0320 11:38:05.173489 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-f6gdx"] Mar 20 11:38:05 crc kubenswrapper[4772]: I0320 11:38:05.178881 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566772-f6gdx"] Mar 20 11:38:06 crc kubenswrapper[4772]: I0320 11:38:06.650259 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="03743166-d880-4e37-affe-51018c1e8a7a" path="/var/lib/kubelet/pods/03743166-d880-4e37-affe-51018c1e8a7a/volumes" Mar 20 11:38:15 crc kubenswrapper[4772]: I0320 11:38:15.642768 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:38:15 crc kubenswrapper[4772]: E0320 11:38:15.644520 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:38:27 crc kubenswrapper[4772]: I0320 11:38:27.642067 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:38:27 crc kubenswrapper[4772]: E0320 11:38:27.642757 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:38:40 crc kubenswrapper[4772]: I0320 11:38:40.318639 4772 scope.go:117] "RemoveContainer" containerID="05b4cb3e37865ce1dd37be6be1ab89d4aa658ed0a605f1b034eb639d25fe53ae" Mar 20 11:38:41 crc kubenswrapper[4772]: I0320 11:38:41.642505 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:38:42 crc kubenswrapper[4772]: I0320 11:38:42.150589 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"d1c815217e6d78011cef6fae18a43310abbde6caa4920ff34c00cf94580234b9"} Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.142530 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566780-pxflm"] Mar 20 11:40:00 crc kubenswrapper[4772]: E0320 11:40:00.143562 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="047fb828-0158-4d01-b9eb-531e3399a2ff" containerName="oc" Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.143582 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="047fb828-0158-4d01-b9eb-531e3399a2ff" containerName="oc" Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.143805 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="047fb828-0158-4d01-b9eb-531e3399a2ff" containerName="oc" Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.144403 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-pxflm" Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.146393 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.146828 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.153676 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.158467 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-pxflm"] Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.275921 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rlmf\" (UniqueName: \"kubernetes.io/projected/43fa2af6-cad9-4228-b5cf-d017b9b55c23-kube-api-access-9rlmf\") pod \"auto-csr-approver-29566780-pxflm\" (UID: \"43fa2af6-cad9-4228-b5cf-d017b9b55c23\") " pod="openshift-infra/auto-csr-approver-29566780-pxflm" Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.377934 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9rlmf\" (UniqueName: \"kubernetes.io/projected/43fa2af6-cad9-4228-b5cf-d017b9b55c23-kube-api-access-9rlmf\") pod \"auto-csr-approver-29566780-pxflm\" (UID: \"43fa2af6-cad9-4228-b5cf-d017b9b55c23\") " pod="openshift-infra/auto-csr-approver-29566780-pxflm" Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.398734 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9rlmf\" (UniqueName: \"kubernetes.io/projected/43fa2af6-cad9-4228-b5cf-d017b9b55c23-kube-api-access-9rlmf\") pod \"auto-csr-approver-29566780-pxflm\" (UID: \"43fa2af6-cad9-4228-b5cf-d017b9b55c23\") " pod="openshift-infra/auto-csr-approver-29566780-pxflm" Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.474549 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-pxflm" Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.893270 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-pxflm"] Mar 20 11:40:00 crc kubenswrapper[4772]: I0320 11:40:00.904627 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:40:01 crc kubenswrapper[4772]: I0320 11:40:01.732193 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-pxflm" event={"ID":"43fa2af6-cad9-4228-b5cf-d017b9b55c23","Type":"ContainerStarted","Data":"6c0853790c7bd03c753f9fb44424f581778e745ae85dd25cfec6b87d70c3fb69"} Mar 20 11:40:02 crc kubenswrapper[4772]: I0320 11:40:02.739828 4772 generic.go:334] "Generic (PLEG): container finished" podID="43fa2af6-cad9-4228-b5cf-d017b9b55c23" containerID="79d73e80e31fe63222b9ca1af50bc59245f704d91c8e4c923149cc4e98215a95" exitCode=0 Mar 20 11:40:02 crc kubenswrapper[4772]: I0320 11:40:02.739960 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-pxflm" event={"ID":"43fa2af6-cad9-4228-b5cf-d017b9b55c23","Type":"ContainerDied","Data":"79d73e80e31fe63222b9ca1af50bc59245f704d91c8e4c923149cc4e98215a95"} Mar 20 11:40:04 crc kubenswrapper[4772]: I0320 11:40:04.023472 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-pxflm" Mar 20 11:40:04 crc kubenswrapper[4772]: I0320 11:40:04.133431 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9rlmf\" (UniqueName: \"kubernetes.io/projected/43fa2af6-cad9-4228-b5cf-d017b9b55c23-kube-api-access-9rlmf\") pod \"43fa2af6-cad9-4228-b5cf-d017b9b55c23\" (UID: \"43fa2af6-cad9-4228-b5cf-d017b9b55c23\") " Mar 20 11:40:04 crc kubenswrapper[4772]: I0320 11:40:04.141595 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43fa2af6-cad9-4228-b5cf-d017b9b55c23-kube-api-access-9rlmf" (OuterVolumeSpecName: "kube-api-access-9rlmf") pod "43fa2af6-cad9-4228-b5cf-d017b9b55c23" (UID: "43fa2af6-cad9-4228-b5cf-d017b9b55c23"). InnerVolumeSpecName "kube-api-access-9rlmf". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:40:04 crc kubenswrapper[4772]: I0320 11:40:04.235326 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9rlmf\" (UniqueName: \"kubernetes.io/projected/43fa2af6-cad9-4228-b5cf-d017b9b55c23-kube-api-access-9rlmf\") on node \"crc\" DevicePath \"\"" Mar 20 11:40:04 crc kubenswrapper[4772]: I0320 11:40:04.757290 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566780-pxflm" event={"ID":"43fa2af6-cad9-4228-b5cf-d017b9b55c23","Type":"ContainerDied","Data":"6c0853790c7bd03c753f9fb44424f581778e745ae85dd25cfec6b87d70c3fb69"} Mar 20 11:40:04 crc kubenswrapper[4772]: I0320 11:40:04.757548 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6c0853790c7bd03c753f9fb44424f581778e745ae85dd25cfec6b87d70c3fb69" Mar 20 11:40:04 crc kubenswrapper[4772]: I0320 11:40:04.757359 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566780-pxflm" Mar 20 11:40:05 crc kubenswrapper[4772]: I0320 11:40:05.090129 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-z6xh6"] Mar 20 11:40:05 crc kubenswrapper[4772]: I0320 11:40:05.095905 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566774-z6xh6"] Mar 20 11:40:06 crc kubenswrapper[4772]: I0320 11:40:06.651715 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="aec50b8d-6d2a-4721-8812-b878368cc151" path="/var/lib/kubelet/pods/aec50b8d-6d2a-4721-8812-b878368cc151/volumes" Mar 20 11:40:10 crc kubenswrapper[4772]: I0320 11:40:10.829239 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-7h2xq"] Mar 20 11:40:10 crc kubenswrapper[4772]: E0320 11:40:10.829884 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43fa2af6-cad9-4228-b5cf-d017b9b55c23" containerName="oc" Mar 20 11:40:10 crc kubenswrapper[4772]: I0320 11:40:10.829898 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="43fa2af6-cad9-4228-b5cf-d017b9b55c23" containerName="oc" Mar 20 11:40:10 crc kubenswrapper[4772]: I0320 11:40:10.830090 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="43fa2af6-cad9-4228-b5cf-d017b9b55c23" containerName="oc" Mar 20 11:40:10 crc kubenswrapper[4772]: I0320 11:40:10.831184 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:10 crc kubenswrapper[4772]: I0320 11:40:10.842953 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7h2xq"] Mar 20 11:40:10 crc kubenswrapper[4772]: I0320 11:40:10.934654 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-utilities\") pod \"redhat-operators-7h2xq\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:10 crc kubenswrapper[4772]: I0320 11:40:10.934729 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xnt8x\" (UniqueName: \"kubernetes.io/projected/6e94d477-eac4-4884-8111-195b2aacf32a-kube-api-access-xnt8x\") pod \"redhat-operators-7h2xq\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:10 crc kubenswrapper[4772]: I0320 11:40:10.934784 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-catalog-content\") pod \"redhat-operators-7h2xq\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:11 crc kubenswrapper[4772]: I0320 11:40:11.035934 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xnt8x\" (UniqueName: \"kubernetes.io/projected/6e94d477-eac4-4884-8111-195b2aacf32a-kube-api-access-xnt8x\") pod \"redhat-operators-7h2xq\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:11 crc kubenswrapper[4772]: I0320 11:40:11.036023 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-catalog-content\") pod \"redhat-operators-7h2xq\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:11 crc kubenswrapper[4772]: I0320 11:40:11.036120 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-utilities\") pod \"redhat-operators-7h2xq\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:11 crc kubenswrapper[4772]: I0320 11:40:11.037545 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-utilities\") pod \"redhat-operators-7h2xq\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:11 crc kubenswrapper[4772]: I0320 11:40:11.037673 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-catalog-content\") pod \"redhat-operators-7h2xq\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:11 crc kubenswrapper[4772]: I0320 11:40:11.061271 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xnt8x\" (UniqueName: \"kubernetes.io/projected/6e94d477-eac4-4884-8111-195b2aacf32a-kube-api-access-xnt8x\") pod \"redhat-operators-7h2xq\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:11 crc kubenswrapper[4772]: I0320 11:40:11.157940 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:11 crc kubenswrapper[4772]: I0320 11:40:11.659497 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-7h2xq"] Mar 20 11:40:11 crc kubenswrapper[4772]: I0320 11:40:11.822585 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h2xq" event={"ID":"6e94d477-eac4-4884-8111-195b2aacf32a","Type":"ContainerStarted","Data":"b6cbb5d146fd42b7819607065e89deeef354fa8837072cef2fa53714e85b3579"} Mar 20 11:40:12 crc kubenswrapper[4772]: I0320 11:40:12.834878 4772 generic.go:334] "Generic (PLEG): container finished" podID="6e94d477-eac4-4884-8111-195b2aacf32a" containerID="805f71d2b2b56e996a9b4a14ef2233b1661046d41c028013d0c7906bef3293ee" exitCode=0 Mar 20 11:40:12 crc kubenswrapper[4772]: I0320 11:40:12.835012 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h2xq" event={"ID":"6e94d477-eac4-4884-8111-195b2aacf32a","Type":"ContainerDied","Data":"805f71d2b2b56e996a9b4a14ef2233b1661046d41c028013d0c7906bef3293ee"} Mar 20 11:40:13 crc kubenswrapper[4772]: I0320 11:40:13.843874 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h2xq" event={"ID":"6e94d477-eac4-4884-8111-195b2aacf32a","Type":"ContainerStarted","Data":"16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0"} Mar 20 11:40:14 crc kubenswrapper[4772]: I0320 11:40:14.853270 4772 generic.go:334] "Generic (PLEG): container finished" podID="6e94d477-eac4-4884-8111-195b2aacf32a" containerID="16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0" exitCode=0 Mar 20 11:40:14 crc kubenswrapper[4772]: I0320 11:40:14.853339 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h2xq" event={"ID":"6e94d477-eac4-4884-8111-195b2aacf32a","Type":"ContainerDied","Data":"16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0"} Mar 20 11:40:16 crc kubenswrapper[4772]: I0320 11:40:16.870510 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h2xq" event={"ID":"6e94d477-eac4-4884-8111-195b2aacf32a","Type":"ContainerStarted","Data":"f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c"} Mar 20 11:40:16 crc kubenswrapper[4772]: I0320 11:40:16.891356 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-7h2xq" podStartSLOduration=3.365987473 podStartE2EDuration="6.891337324s" podCreationTimestamp="2026-03-20 11:40:10 +0000 UTC" firstStartedPulling="2026-03-20 11:40:12.837738192 +0000 UTC m=+2698.928704677" lastFinishedPulling="2026-03-20 11:40:16.363088043 +0000 UTC m=+2702.454054528" observedRunningTime="2026-03-20 11:40:16.890715507 +0000 UTC m=+2702.981681992" watchObservedRunningTime="2026-03-20 11:40:16.891337324 +0000 UTC m=+2702.982303809" Mar 20 11:40:21 crc kubenswrapper[4772]: I0320 11:40:21.158596 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:21 crc kubenswrapper[4772]: I0320 11:40:21.159314 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:22 crc kubenswrapper[4772]: I0320 11:40:22.207497 4772 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-7h2xq" 
podUID="6e94d477-eac4-4884-8111-195b2aacf32a" containerName="registry-server" probeResult="failure" output=< Mar 20 11:40:22 crc kubenswrapper[4772]: timeout: failed to connect service ":50051" within 1s Mar 20 11:40:22 crc kubenswrapper[4772]: > Mar 20 11:40:31 crc kubenswrapper[4772]: I0320 11:40:31.205491 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:31 crc kubenswrapper[4772]: I0320 11:40:31.249111 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:31 crc kubenswrapper[4772]: I0320 11:40:31.456764 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7h2xq"] Mar 20 11:40:32 crc kubenswrapper[4772]: I0320 11:40:32.984352 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-7h2xq" podUID="6e94d477-eac4-4884-8111-195b2aacf32a" containerName="registry-server" containerID="cri-o://f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c" gracePeriod=2 Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.363475 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.489528 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xnt8x\" (UniqueName: \"kubernetes.io/projected/6e94d477-eac4-4884-8111-195b2aacf32a-kube-api-access-xnt8x\") pod \"6e94d477-eac4-4884-8111-195b2aacf32a\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.489672 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-utilities\") pod \"6e94d477-eac4-4884-8111-195b2aacf32a\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.489800 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-catalog-content\") pod \"6e94d477-eac4-4884-8111-195b2aacf32a\" (UID: \"6e94d477-eac4-4884-8111-195b2aacf32a\") " Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.490775 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-utilities" (OuterVolumeSpecName: "utilities") pod "6e94d477-eac4-4884-8111-195b2aacf32a" (UID: "6e94d477-eac4-4884-8111-195b2aacf32a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.497618 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e94d477-eac4-4884-8111-195b2aacf32a-kube-api-access-xnt8x" (OuterVolumeSpecName: "kube-api-access-xnt8x") pod "6e94d477-eac4-4884-8111-195b2aacf32a" (UID: "6e94d477-eac4-4884-8111-195b2aacf32a"). InnerVolumeSpecName "kube-api-access-xnt8x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.591688 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.591728 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xnt8x\" (UniqueName: \"kubernetes.io/projected/6e94d477-eac4-4884-8111-195b2aacf32a-kube-api-access-xnt8x\") on node \"crc\" DevicePath \"\"" Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.625924 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "6e94d477-eac4-4884-8111-195b2aacf32a" (UID: "6e94d477-eac4-4884-8111-195b2aacf32a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.692953 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/6e94d477-eac4-4884-8111-195b2aacf32a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.993577 4772 generic.go:334] "Generic (PLEG): container finished" podID="6e94d477-eac4-4884-8111-195b2aacf32a" containerID="f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c" exitCode=0 Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.993618 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h2xq" event={"ID":"6e94d477-eac4-4884-8111-195b2aacf32a","Type":"ContainerDied","Data":"f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c"} Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.993651 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-7h2xq" Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.993677 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-7h2xq" event={"ID":"6e94d477-eac4-4884-8111-195b2aacf32a","Type":"ContainerDied","Data":"b6cbb5d146fd42b7819607065e89deeef354fa8837072cef2fa53714e85b3579"} Mar 20 11:40:33 crc kubenswrapper[4772]: I0320 11:40:33.993695 4772 scope.go:117] "RemoveContainer" containerID="f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c" Mar 20 11:40:34 crc kubenswrapper[4772]: I0320 11:40:34.015658 4772 scope.go:117] "RemoveContainer" containerID="16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0" Mar 20 11:40:34 crc kubenswrapper[4772]: I0320 11:40:34.043288 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-7h2xq"] Mar 20 11:40:34 crc kubenswrapper[4772]: I0320 11:40:34.043689 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-7h2xq"] Mar 20 11:40:34 crc kubenswrapper[4772]: I0320 11:40:34.045152 4772 scope.go:117] "RemoveContainer" containerID="805f71d2b2b56e996a9b4a14ef2233b1661046d41c028013d0c7906bef3293ee" Mar 20 11:40:34 crc kubenswrapper[4772]: I0320 11:40:34.070096 4772 scope.go:117] "RemoveContainer" containerID="f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c" Mar 20 11:40:34 crc kubenswrapper[4772]: E0320 11:40:34.071098 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c\": container with ID starting with f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c not found: ID does not exist" containerID="f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c" Mar 20 11:40:34 crc kubenswrapper[4772]: I0320 11:40:34.071132 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c"} err="failed to get container status \"f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c\": rpc error: code = NotFound desc = could not find container \"f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c\": container with ID starting with f31aef4eab41f745b57995d12cfd1fc8bf88c0d0938e94375ec116a1fee4169c not found: ID does not exist" Mar 20 11:40:34 crc kubenswrapper[4772]: I0320 11:40:34.071154 4772 scope.go:117] "RemoveContainer" containerID="16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0" Mar 20 11:40:34 crc kubenswrapper[4772]: E0320 11:40:34.072138 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0\": container with ID starting with 16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0 not found: ID does not exist" containerID="16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0" Mar 20 11:40:34 crc kubenswrapper[4772]: I0320 11:40:34.072165 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0"} err="failed to get container status \"16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0\": rpc error: code = NotFound desc = could not find container 
\"16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0\": container with ID starting with 16fea346f9ee06ad281e9ba2165cc69560b491a131b88efe54e5fbdc3ac044e0 not found: ID does not exist" Mar 20 11:40:34 crc kubenswrapper[4772]: I0320 11:40:34.072184 4772 scope.go:117] "RemoveContainer" containerID="805f71d2b2b56e996a9b4a14ef2233b1661046d41c028013d0c7906bef3293ee" Mar 20 11:40:34 crc kubenswrapper[4772]: E0320 11:40:34.072656 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"805f71d2b2b56e996a9b4a14ef2233b1661046d41c028013d0c7906bef3293ee\": container with ID starting with 805f71d2b2b56e996a9b4a14ef2233b1661046d41c028013d0c7906bef3293ee not found: ID does not exist" containerID="805f71d2b2b56e996a9b4a14ef2233b1661046d41c028013d0c7906bef3293ee" Mar 20 11:40:34 crc kubenswrapper[4772]: I0320 11:40:34.072685 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"805f71d2b2b56e996a9b4a14ef2233b1661046d41c028013d0c7906bef3293ee"} err="failed to get container status \"805f71d2b2b56e996a9b4a14ef2233b1661046d41c028013d0c7906bef3293ee\": rpc error: code = NotFound desc = could not find container \"805f71d2b2b56e996a9b4a14ef2233b1661046d41c028013d0c7906bef3293ee\": container with ID starting with 805f71d2b2b56e996a9b4a14ef2233b1661046d41c028013d0c7906bef3293ee not found: ID does not exist" Mar 20 11:40:34 crc kubenswrapper[4772]: I0320 11:40:34.649770 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e94d477-eac4-4884-8111-195b2aacf32a" path="/var/lib/kubelet/pods/6e94d477-eac4-4884-8111-195b2aacf32a/volumes" Mar 20 11:40:40 crc kubenswrapper[4772]: I0320 11:40:40.404505 4772 scope.go:117] "RemoveContainer" containerID="76e0fa81fbd8b238cac7afc60648b6a1c3d3d0d7bb5d96d50d72efcb02e8fe79" Mar 20 11:41:09 crc kubenswrapper[4772]: I0320 11:41:09.564007 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:41:09 crc kubenswrapper[4772]: I0320 11:41:09.564634 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:41:39 crc kubenswrapper[4772]: I0320 11:41:39.564636 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:41:39 crc kubenswrapper[4772]: I0320 11:41:39.565376 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.139699 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566782-ck5fr"] Mar 20 11:42:00 crc 
kubenswrapper[4772]: E0320 11:42:00.140576 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e94d477-eac4-4884-8111-195b2aacf32a" containerName="extract-content" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.140592 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e94d477-eac4-4884-8111-195b2aacf32a" containerName="extract-content" Mar 20 11:42:00 crc kubenswrapper[4772]: E0320 11:42:00.140605 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e94d477-eac4-4884-8111-195b2aacf32a" containerName="extract-utilities" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.140611 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e94d477-eac4-4884-8111-195b2aacf32a" containerName="extract-utilities" Mar 20 11:42:00 crc kubenswrapper[4772]: E0320 11:42:00.140628 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e94d477-eac4-4884-8111-195b2aacf32a" containerName="registry-server" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.140636 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e94d477-eac4-4884-8111-195b2aacf32a" containerName="registry-server" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.140762 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e94d477-eac4-4884-8111-195b2aacf32a" containerName="registry-server" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.141227 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-ck5fr" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.142925 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.143029 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.143229 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.149224 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-ck5fr"] Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.247617 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gqh9x\" (UniqueName: \"kubernetes.io/projected/e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2-kube-api-access-gqh9x\") pod \"auto-csr-approver-29566782-ck5fr\" (UID: \"e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2\") " pod="openshift-infra/auto-csr-approver-29566782-ck5fr" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.349461 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gqh9x\" (UniqueName: \"kubernetes.io/projected/e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2-kube-api-access-gqh9x\") pod \"auto-csr-approver-29566782-ck5fr\" (UID: \"e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2\") " pod="openshift-infra/auto-csr-approver-29566782-ck5fr" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.379255 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gqh9x\" (UniqueName: \"kubernetes.io/projected/e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2-kube-api-access-gqh9x\") pod \"auto-csr-approver-29566782-ck5fr\" (UID: \"e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2\") " pod="openshift-infra/auto-csr-approver-29566782-ck5fr" Mar 20 11:42:00 
crc kubenswrapper[4772]: I0320 11:42:00.461407 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-ck5fr" Mar 20 11:42:00 crc kubenswrapper[4772]: I0320 11:42:00.856700 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-ck5fr"] Mar 20 11:42:01 crc kubenswrapper[4772]: I0320 11:42:01.616054 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-ck5fr" event={"ID":"e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2","Type":"ContainerStarted","Data":"9e3d606261b0810f8b75c72aa0f3467d7f355128b185ddcb4761d90cab2901b6"} Mar 20 11:42:02 crc kubenswrapper[4772]: I0320 11:42:02.624727 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-ck5fr" event={"ID":"e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2","Type":"ContainerStarted","Data":"69e0f11b865d7fca2ec3d1de0b747929b49ab7b4928af53fb02555a98c4a8a36"} Mar 20 11:42:02 crc kubenswrapper[4772]: I0320 11:42:02.642527 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566782-ck5fr" podStartSLOduration=1.389589758 podStartE2EDuration="2.642502711s" podCreationTimestamp="2026-03-20 11:42:00 +0000 UTC" firstStartedPulling="2026-03-20 11:42:00.864332543 +0000 UTC m=+2806.955299028" lastFinishedPulling="2026-03-20 11:42:02.117245496 +0000 UTC m=+2808.208211981" observedRunningTime="2026-03-20 11:42:02.640758173 +0000 UTC m=+2808.731724678" watchObservedRunningTime="2026-03-20 11:42:02.642502711 +0000 UTC m=+2808.733469196" Mar 20 11:42:03 crc kubenswrapper[4772]: I0320 11:42:03.633285 4772 generic.go:334] "Generic (PLEG): container finished" podID="e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2" containerID="69e0f11b865d7fca2ec3d1de0b747929b49ab7b4928af53fb02555a98c4a8a36" exitCode=0 Mar 20 11:42:03 crc kubenswrapper[4772]: I0320 11:42:03.633340 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-ck5fr" event={"ID":"e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2","Type":"ContainerDied","Data":"69e0f11b865d7fca2ec3d1de0b747929b49ab7b4928af53fb02555a98c4a8a36"} Mar 20 11:42:04 crc kubenswrapper[4772]: I0320 11:42:04.944745 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-ck5fr" Mar 20 11:42:05 crc kubenswrapper[4772]: I0320 11:42:05.013264 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gqh9x\" (UniqueName: \"kubernetes.io/projected/e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2-kube-api-access-gqh9x\") pod \"e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2\" (UID: \"e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2\") " Mar 20 11:42:05 crc kubenswrapper[4772]: I0320 11:42:05.023016 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2-kube-api-access-gqh9x" (OuterVolumeSpecName: "kube-api-access-gqh9x") pod "e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2" (UID: "e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2"). InnerVolumeSpecName "kube-api-access-gqh9x". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:42:05 crc kubenswrapper[4772]: I0320 11:42:05.115474 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gqh9x\" (UniqueName: \"kubernetes.io/projected/e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2-kube-api-access-gqh9x\") on node \"crc\" DevicePath \"\"" Mar 20 11:42:05 crc kubenswrapper[4772]: I0320 11:42:05.648158 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566782-ck5fr" event={"ID":"e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2","Type":"ContainerDied","Data":"9e3d606261b0810f8b75c72aa0f3467d7f355128b185ddcb4761d90cab2901b6"} Mar 20 11:42:05 crc kubenswrapper[4772]: I0320 11:42:05.648205 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="9e3d606261b0810f8b75c72aa0f3467d7f355128b185ddcb4761d90cab2901b6" Mar 20 11:42:05 crc kubenswrapper[4772]: I0320 11:42:05.648212 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566782-ck5fr" Mar 20 11:42:05 crc kubenswrapper[4772]: I0320 11:42:05.701786 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-mv89f"] Mar 20 11:42:05 crc kubenswrapper[4772]: I0320 11:42:05.707224 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566776-mv89f"] Mar 20 11:42:06 crc kubenswrapper[4772]: I0320 11:42:06.650797 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49979ceb-586d-484a-96cf-989cf2703da2" path="/var/lib/kubelet/pods/49979ceb-586d-484a-96cf-989cf2703da2/volumes" Mar 20 11:42:09 crc kubenswrapper[4772]: I0320 11:42:09.565333 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:42:09 crc kubenswrapper[4772]: I0320 11:42:09.566050 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:42:09 crc kubenswrapper[4772]: I0320 11:42:09.566136 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 11:42:09 crc kubenswrapper[4772]: I0320 11:42:09.567277 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"d1c815217e6d78011cef6fae18a43310abbde6caa4920ff34c00cf94580234b9"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:42:09 crc kubenswrapper[4772]: I0320 11:42:09.567352 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://d1c815217e6d78011cef6fae18a43310abbde6caa4920ff34c00cf94580234b9" gracePeriod=600 Mar 20 11:42:10 crc kubenswrapper[4772]: I0320 11:42:10.682613 4772 generic.go:334] "Generic 
(PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="d1c815217e6d78011cef6fae18a43310abbde6caa4920ff34c00cf94580234b9" exitCode=0 Mar 20 11:42:10 crc kubenswrapper[4772]: I0320 11:42:10.682685 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"d1c815217e6d78011cef6fae18a43310abbde6caa4920ff34c00cf94580234b9"} Mar 20 11:42:10 crc kubenswrapper[4772]: I0320 11:42:10.683063 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a"} Mar 20 11:42:10 crc kubenswrapper[4772]: I0320 11:42:10.683081 4772 scope.go:117] "RemoveContainer" containerID="f3a392fbc9a25ebeeaa936df89e75bf911425ddb233305195fc9fbc24adab7af" Mar 20 11:42:40 crc kubenswrapper[4772]: I0320 11:42:40.491160 4772 scope.go:117] "RemoveContainer" containerID="97a2d83cf1b1364d878a5ecc4d9ce90bf7e072df8da68221e2a60268f8f594b7" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.562665 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-dwrf2"] Mar 20 11:43:29 crc kubenswrapper[4772]: E0320 11:43:29.563571 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2" containerName="oc" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.563590 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2" containerName="oc" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.563773 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2" containerName="oc" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.564994 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.575148 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dwrf2"] Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.748358 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4vw8\" (UniqueName: \"kubernetes.io/projected/fecd6676-c293-4cda-90c6-4b91a2a0f449-kube-api-access-f4vw8\") pod \"certified-operators-dwrf2\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.748575 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-utilities\") pod \"certified-operators-dwrf2\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.748651 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-catalog-content\") pod \"certified-operators-dwrf2\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.755587 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-spzxd"] Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.756855 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.778924 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-spzxd"] Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.850287 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-utilities\") pod \"certified-operators-dwrf2\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.850602 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-catalog-content\") pod \"certified-operators-dwrf2\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.850712 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f4vw8\" (UniqueName: \"kubernetes.io/projected/fecd6676-c293-4cda-90c6-4b91a2a0f449-kube-api-access-f4vw8\") pod \"certified-operators-dwrf2\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.850792 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-utilities\") pod \"certified-operators-dwrf2\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.851077 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-catalog-content\") pod \"certified-operators-dwrf2\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.882127 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4vw8\" (UniqueName: \"kubernetes.io/projected/fecd6676-c293-4cda-90c6-4b91a2a0f449-kube-api-access-f4vw8\") pod \"certified-operators-dwrf2\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.899319 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.952094 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-utilities\") pod \"community-operators-spzxd\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.952190 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-catalog-content\") pod \"community-operators-spzxd\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:29 crc kubenswrapper[4772]: I0320 11:43:29.952256 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c75g5\" (UniqueName: \"kubernetes.io/projected/785561f8-4a87-4e4e-8ba7-289f0b47e538-kube-api-access-c75g5\") pod \"community-operators-spzxd\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:30 crc kubenswrapper[4772]: I0320 11:43:30.054300 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-catalog-content\") pod \"community-operators-spzxd\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:30 crc kubenswrapper[4772]: I0320 11:43:30.054866 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-catalog-content\") pod \"community-operators-spzxd\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:30 crc kubenswrapper[4772]: I0320 11:43:30.054913 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-c75g5\" (UniqueName: \"kubernetes.io/projected/785561f8-4a87-4e4e-8ba7-289f0b47e538-kube-api-access-c75g5\") pod \"community-operators-spzxd\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:30 crc kubenswrapper[4772]: I0320 11:43:30.054978 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-utilities\") pod \"community-operators-spzxd\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:30 crc kubenswrapper[4772]: I0320 11:43:30.055473 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-utilities\") pod \"community-operators-spzxd\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:30 crc kubenswrapper[4772]: I0320 11:43:30.082060 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-c75g5\" (UniqueName: \"kubernetes.io/projected/785561f8-4a87-4e4e-8ba7-289f0b47e538-kube-api-access-c75g5\") pod 
\"community-operators-spzxd\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:30 crc kubenswrapper[4772]: I0320 11:43:30.087159 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:30 crc kubenswrapper[4772]: I0320 11:43:30.419407 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-dwrf2"] Mar 20 11:43:30 crc kubenswrapper[4772]: I0320 11:43:30.601293 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-spzxd"] Mar 20 11:43:30 crc kubenswrapper[4772]: W0320 11:43:30.611654 4772 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod785561f8_4a87_4e4e_8ba7_289f0b47e538.slice/crio-6e2ea7a0dc3f558c3eb77752c9587741b49f79b57d3d555274319306dbc270ca WatchSource:0}: Error finding container 6e2ea7a0dc3f558c3eb77752c9587741b49f79b57d3d555274319306dbc270ca: Status 404 returned error can't find the container with id 6e2ea7a0dc3f558c3eb77752c9587741b49f79b57d3d555274319306dbc270ca Mar 20 11:43:31 crc kubenswrapper[4772]: I0320 11:43:31.244632 4772 generic.go:334] "Generic (PLEG): container finished" podID="785561f8-4a87-4e4e-8ba7-289f0b47e538" containerID="a7987fad04d6697665f332a1282b7310bfc9d1dd8767820a6410e58a25a425ed" exitCode=0 Mar 20 11:43:31 crc kubenswrapper[4772]: I0320 11:43:31.244773 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spzxd" event={"ID":"785561f8-4a87-4e4e-8ba7-289f0b47e538","Type":"ContainerDied","Data":"a7987fad04d6697665f332a1282b7310bfc9d1dd8767820a6410e58a25a425ed"} Mar 20 11:43:31 crc kubenswrapper[4772]: I0320 11:43:31.244807 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spzxd" event={"ID":"785561f8-4a87-4e4e-8ba7-289f0b47e538","Type":"ContainerStarted","Data":"6e2ea7a0dc3f558c3eb77752c9587741b49f79b57d3d555274319306dbc270ca"} Mar 20 11:43:31 crc kubenswrapper[4772]: I0320 11:43:31.247426 4772 generic.go:334] "Generic (PLEG): container finished" podID="fecd6676-c293-4cda-90c6-4b91a2a0f449" containerID="d9f792a08b423d8a9b57a8b1d679381b56e126ce74eb1d30d17b0059fc06b9bb" exitCode=0 Mar 20 11:43:31 crc kubenswrapper[4772]: I0320 11:43:31.247484 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwrf2" event={"ID":"fecd6676-c293-4cda-90c6-4b91a2a0f449","Type":"ContainerDied","Data":"d9f792a08b423d8a9b57a8b1d679381b56e126ce74eb1d30d17b0059fc06b9bb"} Mar 20 11:43:31 crc kubenswrapper[4772]: I0320 11:43:31.247514 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwrf2" event={"ID":"fecd6676-c293-4cda-90c6-4b91a2a0f449","Type":"ContainerStarted","Data":"9717fb213aa3dc5677aed3098cd650859883326d1aa7e75b48cb6162d1f47093"} Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.756783 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-qssd7"] Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.759200 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.767091 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qssd7"] Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.804207 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z8c9h\" (UniqueName: \"kubernetes.io/projected/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-kube-api-access-z8c9h\") pod \"redhat-marketplace-qssd7\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.804276 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-catalog-content\") pod \"redhat-marketplace-qssd7\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.804309 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-utilities\") pod \"redhat-marketplace-qssd7\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.905820 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z8c9h\" (UniqueName: \"kubernetes.io/projected/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-kube-api-access-z8c9h\") pod \"redhat-marketplace-qssd7\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.905916 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-catalog-content\") pod \"redhat-marketplace-qssd7\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.905941 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-utilities\") pod \"redhat-marketplace-qssd7\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.906579 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-utilities\") pod \"redhat-marketplace-qssd7\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.906600 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-catalog-content\") pod \"redhat-marketplace-qssd7\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:32 crc kubenswrapper[4772]: I0320 11:43:32.925781 4772 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-z8c9h\" (UniqueName: \"kubernetes.io/projected/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-kube-api-access-z8c9h\") pod \"redhat-marketplace-qssd7\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:33 crc kubenswrapper[4772]: I0320 11:43:33.076416 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:33 crc kubenswrapper[4772]: I0320 11:43:33.279146 4772 generic.go:334] "Generic (PLEG): container finished" podID="fecd6676-c293-4cda-90c6-4b91a2a0f449" containerID="8c5085ccbbfbd8113ec46cc75c292011f5ffe04e3ac0215dc772fe2199abc2c7" exitCode=0 Mar 20 11:43:33 crc kubenswrapper[4772]: I0320 11:43:33.279388 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwrf2" event={"ID":"fecd6676-c293-4cda-90c6-4b91a2a0f449","Type":"ContainerDied","Data":"8c5085ccbbfbd8113ec46cc75c292011f5ffe04e3ac0215dc772fe2199abc2c7"} Mar 20 11:43:33 crc kubenswrapper[4772]: I0320 11:43:33.283366 4772 generic.go:334] "Generic (PLEG): container finished" podID="785561f8-4a87-4e4e-8ba7-289f0b47e538" containerID="081d13b7495be2aabf01985bfcceba573833877fa717cc74dfad290ace2183f8" exitCode=0 Mar 20 11:43:33 crc kubenswrapper[4772]: I0320 11:43:33.283411 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spzxd" event={"ID":"785561f8-4a87-4e4e-8ba7-289f0b47e538","Type":"ContainerDied","Data":"081d13b7495be2aabf01985bfcceba573833877fa717cc74dfad290ace2183f8"} Mar 20 11:43:33 crc kubenswrapper[4772]: I0320 11:43:33.513317 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-qssd7"] Mar 20 11:43:34 crc kubenswrapper[4772]: I0320 11:43:34.292370 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwrf2" event={"ID":"fecd6676-c293-4cda-90c6-4b91a2a0f449","Type":"ContainerStarted","Data":"7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2"} Mar 20 11:43:34 crc kubenswrapper[4772]: I0320 11:43:34.293747 4772 generic.go:334] "Generic (PLEG): container finished" podID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" containerID="ff03723d3cc3d382eff8931ccdcc4d7e5cdebb69afefe7692379d9729007e358" exitCode=0 Mar 20 11:43:34 crc kubenswrapper[4772]: I0320 11:43:34.293802 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qssd7" event={"ID":"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a","Type":"ContainerDied","Data":"ff03723d3cc3d382eff8931ccdcc4d7e5cdebb69afefe7692379d9729007e358"} Mar 20 11:43:34 crc kubenswrapper[4772]: I0320 11:43:34.293825 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qssd7" event={"ID":"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a","Type":"ContainerStarted","Data":"2b1bbb9f90e7a3f7418a3ce9d09e001cc760d48d457ebe3e9f7ba4703dc3d976"} Mar 20 11:43:34 crc kubenswrapper[4772]: I0320 11:43:34.298651 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spzxd" event={"ID":"785561f8-4a87-4e4e-8ba7-289f0b47e538","Type":"ContainerStarted","Data":"191ce4a7833a95c3f35eb6d6c61c61671da672284c3e36701354264880d9fb74"} Mar 20 11:43:34 crc kubenswrapper[4772]: I0320 11:43:34.328155 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-dwrf2" 
podStartSLOduration=2.714204643 podStartE2EDuration="5.328131217s" podCreationTimestamp="2026-03-20 11:43:29 +0000 UTC" firstStartedPulling="2026-03-20 11:43:31.250663937 +0000 UTC m=+2897.341630422" lastFinishedPulling="2026-03-20 11:43:33.864590511 +0000 UTC m=+2899.955556996" observedRunningTime="2026-03-20 11:43:34.321138146 +0000 UTC m=+2900.412104651" watchObservedRunningTime="2026-03-20 11:43:34.328131217 +0000 UTC m=+2900.419097702" Mar 20 11:43:34 crc kubenswrapper[4772]: I0320 11:43:34.363037 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-spzxd" podStartSLOduration=2.682078795 podStartE2EDuration="5.362991711s" podCreationTimestamp="2026-03-20 11:43:29 +0000 UTC" firstStartedPulling="2026-03-20 11:43:31.247735917 +0000 UTC m=+2897.338702402" lastFinishedPulling="2026-03-20 11:43:33.928648833 +0000 UTC m=+2900.019615318" observedRunningTime="2026-03-20 11:43:34.362697403 +0000 UTC m=+2900.453663898" watchObservedRunningTime="2026-03-20 11:43:34.362991711 +0000 UTC m=+2900.453958206" Mar 20 11:43:35 crc kubenswrapper[4772]: I0320 11:43:35.306368 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qssd7" event={"ID":"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a","Type":"ContainerStarted","Data":"f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c"} Mar 20 11:43:36 crc kubenswrapper[4772]: I0320 11:43:36.314452 4772 generic.go:334] "Generic (PLEG): container finished" podID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" containerID="f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c" exitCode=0 Mar 20 11:43:36 crc kubenswrapper[4772]: I0320 11:43:36.314537 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qssd7" event={"ID":"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a","Type":"ContainerDied","Data":"f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c"} Mar 20 11:43:38 crc kubenswrapper[4772]: I0320 11:43:38.334080 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qssd7" event={"ID":"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a","Type":"ContainerStarted","Data":"4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d"} Mar 20 11:43:38 crc kubenswrapper[4772]: I0320 11:43:38.364235 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-qssd7" podStartSLOduration=3.061484523 podStartE2EDuration="6.364198143s" podCreationTimestamp="2026-03-20 11:43:32 +0000 UTC" firstStartedPulling="2026-03-20 11:43:34.294935509 +0000 UTC m=+2900.385901994" lastFinishedPulling="2026-03-20 11:43:37.597649129 +0000 UTC m=+2903.688615614" observedRunningTime="2026-03-20 11:43:38.356021329 +0000 UTC m=+2904.446987824" watchObservedRunningTime="2026-03-20 11:43:38.364198143 +0000 UTC m=+2904.455164618" Mar 20 11:43:39 crc kubenswrapper[4772]: I0320 11:43:39.899654 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:39 crc kubenswrapper[4772]: I0320 11:43:39.899713 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:39 crc kubenswrapper[4772]: I0320 11:43:39.940706 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:40 crc kubenswrapper[4772]: I0320 
11:43:40.087887 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:40 crc kubenswrapper[4772]: I0320 11:43:40.088277 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:40 crc kubenswrapper[4772]: I0320 11:43:40.136150 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:40 crc kubenswrapper[4772]: I0320 11:43:40.393433 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:40 crc kubenswrapper[4772]: I0320 11:43:40.395576 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:42 crc kubenswrapper[4772]: I0320 11:43:42.351221 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dwrf2"] Mar 20 11:43:42 crc kubenswrapper[4772]: I0320 11:43:42.365220 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-dwrf2" podUID="fecd6676-c293-4cda-90c6-4b91a2a0f449" containerName="registry-server" containerID="cri-o://7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2" gracePeriod=2 Mar 20 11:43:42 crc kubenswrapper[4772]: I0320 11:43:42.550794 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-spzxd"] Mar 20 11:43:42 crc kubenswrapper[4772]: I0320 11:43:42.551419 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-spzxd" podUID="785561f8-4a87-4e4e-8ba7-289f0b47e538" containerName="registry-server" containerID="cri-o://191ce4a7833a95c3f35eb6d6c61c61671da672284c3e36701354264880d9fb74" gracePeriod=2 Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.077335 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.077380 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.127917 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.303206 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.375991 4772 generic.go:334] "Generic (PLEG): container finished" podID="785561f8-4a87-4e4e-8ba7-289f0b47e538" containerID="191ce4a7833a95c3f35eb6d6c61c61671da672284c3e36701354264880d9fb74" exitCode=0 Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.376093 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spzxd" event={"ID":"785561f8-4a87-4e4e-8ba7-289f0b47e538","Type":"ContainerDied","Data":"191ce4a7833a95c3f35eb6d6c61c61671da672284c3e36701354264880d9fb74"} Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.378675 4772 generic.go:334] "Generic (PLEG): container finished" podID="fecd6676-c293-4cda-90c6-4b91a2a0f449" containerID="7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2" exitCode=0 Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.378996 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-dwrf2" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.379013 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwrf2" event={"ID":"fecd6676-c293-4cda-90c6-4b91a2a0f449","Type":"ContainerDied","Data":"7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2"} Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.379194 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-dwrf2" event={"ID":"fecd6676-c293-4cda-90c6-4b91a2a0f449","Type":"ContainerDied","Data":"9717fb213aa3dc5677aed3098cd650859883326d1aa7e75b48cb6162d1f47093"} Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.379223 4772 scope.go:117] "RemoveContainer" containerID="7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.389693 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f4vw8\" (UniqueName: \"kubernetes.io/projected/fecd6676-c293-4cda-90c6-4b91a2a0f449-kube-api-access-f4vw8\") pod \"fecd6676-c293-4cda-90c6-4b91a2a0f449\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.389796 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-catalog-content\") pod \"fecd6676-c293-4cda-90c6-4b91a2a0f449\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.389979 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-utilities\") pod \"fecd6676-c293-4cda-90c6-4b91a2a0f449\" (UID: \"fecd6676-c293-4cda-90c6-4b91a2a0f449\") " Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.391934 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-utilities" (OuterVolumeSpecName: "utilities") pod "fecd6676-c293-4cda-90c6-4b91a2a0f449" (UID: "fecd6676-c293-4cda-90c6-4b91a2a0f449"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.398368 4772 scope.go:117] "RemoveContainer" containerID="8c5085ccbbfbd8113ec46cc75c292011f5ffe04e3ac0215dc772fe2199abc2c7" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.413414 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fecd6676-c293-4cda-90c6-4b91a2a0f449-kube-api-access-f4vw8" (OuterVolumeSpecName: "kube-api-access-f4vw8") pod "fecd6676-c293-4cda-90c6-4b91a2a0f449" (UID: "fecd6676-c293-4cda-90c6-4b91a2a0f449"). InnerVolumeSpecName "kube-api-access-f4vw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.431991 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.464218 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "fecd6676-c293-4cda-90c6-4b91a2a0f449" (UID: "fecd6676-c293-4cda-90c6-4b91a2a0f449"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.477937 4772 scope.go:117] "RemoveContainer" containerID="d9f792a08b423d8a9b57a8b1d679381b56e126ce74eb1d30d17b0059fc06b9bb" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.493919 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.493981 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f4vw8\" (UniqueName: \"kubernetes.io/projected/fecd6676-c293-4cda-90c6-4b91a2a0f449-kube-api-access-f4vw8\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.493998 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/fecd6676-c293-4cda-90c6-4b91a2a0f449-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.552030 4772 scope.go:117] "RemoveContainer" containerID="7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2" Mar 20 11:43:43 crc kubenswrapper[4772]: E0320 11:43:43.556542 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2\": container with ID starting with 7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2 not found: ID does not exist" containerID="7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.556582 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2"} err="failed to get container status \"7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2\": rpc error: code = NotFound desc = could not find container \"7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2\": container with ID starting with 7e131192304acc208ecd6ba16807be9a339edc3403df14af82473f366772b9d2 not found: ID does not 
exist" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.556608 4772 scope.go:117] "RemoveContainer" containerID="8c5085ccbbfbd8113ec46cc75c292011f5ffe04e3ac0215dc772fe2199abc2c7" Mar 20 11:43:43 crc kubenswrapper[4772]: E0320 11:43:43.560051 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c5085ccbbfbd8113ec46cc75c292011f5ffe04e3ac0215dc772fe2199abc2c7\": container with ID starting with 8c5085ccbbfbd8113ec46cc75c292011f5ffe04e3ac0215dc772fe2199abc2c7 not found: ID does not exist" containerID="8c5085ccbbfbd8113ec46cc75c292011f5ffe04e3ac0215dc772fe2199abc2c7" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.560110 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c5085ccbbfbd8113ec46cc75c292011f5ffe04e3ac0215dc772fe2199abc2c7"} err="failed to get container status \"8c5085ccbbfbd8113ec46cc75c292011f5ffe04e3ac0215dc772fe2199abc2c7\": rpc error: code = NotFound desc = could not find container \"8c5085ccbbfbd8113ec46cc75c292011f5ffe04e3ac0215dc772fe2199abc2c7\": container with ID starting with 8c5085ccbbfbd8113ec46cc75c292011f5ffe04e3ac0215dc772fe2199abc2c7 not found: ID does not exist" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.560140 4772 scope.go:117] "RemoveContainer" containerID="d9f792a08b423d8a9b57a8b1d679381b56e126ce74eb1d30d17b0059fc06b9bb" Mar 20 11:43:43 crc kubenswrapper[4772]: E0320 11:43:43.561305 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9f792a08b423d8a9b57a8b1d679381b56e126ce74eb1d30d17b0059fc06b9bb\": container with ID starting with d9f792a08b423d8a9b57a8b1d679381b56e126ce74eb1d30d17b0059fc06b9bb not found: ID does not exist" containerID="d9f792a08b423d8a9b57a8b1d679381b56e126ce74eb1d30d17b0059fc06b9bb" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.561342 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9f792a08b423d8a9b57a8b1d679381b56e126ce74eb1d30d17b0059fc06b9bb"} err="failed to get container status \"d9f792a08b423d8a9b57a8b1d679381b56e126ce74eb1d30d17b0059fc06b9bb\": rpc error: code = NotFound desc = could not find container \"d9f792a08b423d8a9b57a8b1d679381b56e126ce74eb1d30d17b0059fc06b9bb\": container with ID starting with d9f792a08b423d8a9b57a8b1d679381b56e126ce74eb1d30d17b0059fc06b9bb not found: ID does not exist" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.634710 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.702214 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-catalog-content\") pod \"785561f8-4a87-4e4e-8ba7-289f0b47e538\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.702461 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-c75g5\" (UniqueName: \"kubernetes.io/projected/785561f8-4a87-4e4e-8ba7-289f0b47e538-kube-api-access-c75g5\") pod \"785561f8-4a87-4e4e-8ba7-289f0b47e538\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.705820 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-utilities\") pod \"785561f8-4a87-4e4e-8ba7-289f0b47e538\" (UID: \"785561f8-4a87-4e4e-8ba7-289f0b47e538\") " Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.708364 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-utilities" (OuterVolumeSpecName: "utilities") pod "785561f8-4a87-4e4e-8ba7-289f0b47e538" (UID: "785561f8-4a87-4e4e-8ba7-289f0b47e538"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.718386 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-dwrf2"] Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.724928 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-dwrf2"] Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.726993 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/785561f8-4a87-4e4e-8ba7-289f0b47e538-kube-api-access-c75g5" (OuterVolumeSpecName: "kube-api-access-c75g5") pod "785561f8-4a87-4e4e-8ba7-289f0b47e538" (UID: "785561f8-4a87-4e4e-8ba7-289f0b47e538"). InnerVolumeSpecName "kube-api-access-c75g5". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.809176 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:43 crc kubenswrapper[4772]: I0320 11:43:43.809217 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-c75g5\" (UniqueName: \"kubernetes.io/projected/785561f8-4a87-4e4e-8ba7-289f0b47e538-kube-api-access-c75g5\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:44 crc kubenswrapper[4772]: I0320 11:43:44.387005 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-spzxd" Mar 20 11:43:44 crc kubenswrapper[4772]: I0320 11:43:44.386966 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-spzxd" event={"ID":"785561f8-4a87-4e4e-8ba7-289f0b47e538","Type":"ContainerDied","Data":"6e2ea7a0dc3f558c3eb77752c9587741b49f79b57d3d555274319306dbc270ca"} Mar 20 11:43:44 crc kubenswrapper[4772]: I0320 11:43:44.387447 4772 scope.go:117] "RemoveContainer" containerID="191ce4a7833a95c3f35eb6d6c61c61671da672284c3e36701354264880d9fb74" Mar 20 11:43:44 crc kubenswrapper[4772]: I0320 11:43:44.402288 4772 scope.go:117] "RemoveContainer" containerID="081d13b7495be2aabf01985bfcceba573833877fa717cc74dfad290ace2183f8" Mar 20 11:43:44 crc kubenswrapper[4772]: I0320 11:43:44.418253 4772 scope.go:117] "RemoveContainer" containerID="a7987fad04d6697665f332a1282b7310bfc9d1dd8767820a6410e58a25a425ed" Mar 20 11:43:44 crc kubenswrapper[4772]: I0320 11:43:44.652442 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fecd6676-c293-4cda-90c6-4b91a2a0f449" path="/var/lib/kubelet/pods/fecd6676-c293-4cda-90c6-4b91a2a0f449/volumes" Mar 20 11:43:44 crc kubenswrapper[4772]: I0320 11:43:44.856038 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "785561f8-4a87-4e4e-8ba7-289f0b47e538" (UID: "785561f8-4a87-4e4e-8ba7-289f0b47e538"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:44 crc kubenswrapper[4772]: I0320 11:43:44.925286 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/785561f8-4a87-4e4e-8ba7-289f0b47e538-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:45 crc kubenswrapper[4772]: I0320 11:43:45.020137 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-spzxd"] Mar 20 11:43:45 crc kubenswrapper[4772]: I0320 11:43:45.026667 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-spzxd"] Mar 20 11:43:46 crc kubenswrapper[4772]: I0320 11:43:46.652792 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="785561f8-4a87-4e4e-8ba7-289f0b47e538" path="/var/lib/kubelet/pods/785561f8-4a87-4e4e-8ba7-289f0b47e538/volumes" Mar 20 11:43:46 crc kubenswrapper[4772]: I0320 11:43:46.746784 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qssd7"] Mar 20 11:43:46 crc kubenswrapper[4772]: I0320 11:43:46.747070 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-qssd7" podUID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" containerName="registry-server" containerID="cri-o://4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d" gracePeriod=2 Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.105227 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.170344 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-catalog-content\") pod \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.170483 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z8c9h\" (UniqueName: \"kubernetes.io/projected/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-kube-api-access-z8c9h\") pod \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.170549 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-utilities\") pod \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\" (UID: \"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a\") " Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.171622 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-utilities" (OuterVolumeSpecName: "utilities") pod "82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" (UID: "82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.187114 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-kube-api-access-z8c9h" (OuterVolumeSpecName: "kube-api-access-z8c9h") pod "82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" (UID: "82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a"). InnerVolumeSpecName "kube-api-access-z8c9h". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.216593 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" (UID: "82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.272071 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.272125 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.272144 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z8c9h\" (UniqueName: \"kubernetes.io/projected/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a-kube-api-access-z8c9h\") on node \"crc\" DevicePath \"\"" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.417132 4772 generic.go:334] "Generic (PLEG): container finished" podID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" containerID="4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d" exitCode=0 Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.417795 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qssd7" event={"ID":"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a","Type":"ContainerDied","Data":"4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d"} Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.417883 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-qssd7" event={"ID":"82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a","Type":"ContainerDied","Data":"2b1bbb9f90e7a3f7418a3ce9d09e001cc760d48d457ebe3e9f7ba4703dc3d976"} Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.417913 4772 scope.go:117] "RemoveContainer" containerID="4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.418115 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-qssd7" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.437273 4772 scope.go:117] "RemoveContainer" containerID="f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.458178 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-qssd7"] Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.458577 4772 scope.go:117] "RemoveContainer" containerID="ff03723d3cc3d382eff8931ccdcc4d7e5cdebb69afefe7692379d9729007e358" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.469734 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-qssd7"] Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.489487 4772 scope.go:117] "RemoveContainer" containerID="4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d" Mar 20 11:43:48 crc kubenswrapper[4772]: E0320 11:43:48.490052 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d\": container with ID starting with 4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d not found: ID does not exist" containerID="4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.490089 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d"} err="failed to get container status \"4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d\": rpc error: code = NotFound desc = could not find container \"4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d\": container with ID starting with 4314b658f2e9c5175ccc68618405a4c0e76e623af4311ec06bf831369bc1a28d not found: ID does not exist" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.490119 4772 scope.go:117] "RemoveContainer" containerID="f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c" Mar 20 11:43:48 crc kubenswrapper[4772]: E0320 11:43:48.490447 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c\": container with ID starting with f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c not found: ID does not exist" containerID="f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.490480 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c"} err="failed to get container status \"f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c\": rpc error: code = NotFound desc = could not find container \"f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c\": container with ID starting with f77864b35e458a366712038d21bd9383d78312878bba1534698de593f13b908c not found: ID does not exist" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.490497 4772 scope.go:117] "RemoveContainer" containerID="ff03723d3cc3d382eff8931ccdcc4d7e5cdebb69afefe7692379d9729007e358" Mar 20 11:43:48 crc kubenswrapper[4772]: E0320 11:43:48.490802 4772 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"ff03723d3cc3d382eff8931ccdcc4d7e5cdebb69afefe7692379d9729007e358\": container with ID starting with ff03723d3cc3d382eff8931ccdcc4d7e5cdebb69afefe7692379d9729007e358 not found: ID does not exist" containerID="ff03723d3cc3d382eff8931ccdcc4d7e5cdebb69afefe7692379d9729007e358" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.490866 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff03723d3cc3d382eff8931ccdcc4d7e5cdebb69afefe7692379d9729007e358"} err="failed to get container status \"ff03723d3cc3d382eff8931ccdcc4d7e5cdebb69afefe7692379d9729007e358\": rpc error: code = NotFound desc = could not find container \"ff03723d3cc3d382eff8931ccdcc4d7e5cdebb69afefe7692379d9729007e358\": container with ID starting with ff03723d3cc3d382eff8931ccdcc4d7e5cdebb69afefe7692379d9729007e358 not found: ID does not exist" Mar 20 11:43:48 crc kubenswrapper[4772]: I0320 11:43:48.652011 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" path="/var/lib/kubelet/pods/82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a/volumes" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.147180 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566784-xshgf"] Mar 20 11:44:00 crc kubenswrapper[4772]: E0320 11:44:00.148139 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785561f8-4a87-4e4e-8ba7-289f0b47e538" containerName="extract-content" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148185 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="785561f8-4a87-4e4e-8ba7-289f0b47e538" containerName="extract-content" Mar 20 11:44:00 crc kubenswrapper[4772]: E0320 11:44:00.148205 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785561f8-4a87-4e4e-8ba7-289f0b47e538" containerName="extract-utilities" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148214 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="785561f8-4a87-4e4e-8ba7-289f0b47e538" containerName="extract-utilities" Mar 20 11:44:00 crc kubenswrapper[4772]: E0320 11:44:00.148228 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="785561f8-4a87-4e4e-8ba7-289f0b47e538" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148237 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="785561f8-4a87-4e4e-8ba7-289f0b47e538" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4772]: E0320 11:44:00.148253 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" containerName="extract-utilities" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148261 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" containerName="extract-utilities" Mar 20 11:44:00 crc kubenswrapper[4772]: E0320 11:44:00.148290 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecd6676-c293-4cda-90c6-4b91a2a0f449" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148299 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecd6676-c293-4cda-90c6-4b91a2a0f449" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4772]: E0320 11:44:00.148310 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" 
containerName="extract-content" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148319 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" containerName="extract-content" Mar 20 11:44:00 crc kubenswrapper[4772]: E0320 11:44:00.148336 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecd6676-c293-4cda-90c6-4b91a2a0f449" containerName="extract-utilities" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148345 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecd6676-c293-4cda-90c6-4b91a2a0f449" containerName="extract-utilities" Mar 20 11:44:00 crc kubenswrapper[4772]: E0320 11:44:00.148354 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148362 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4772]: E0320 11:44:00.148376 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fecd6676-c293-4cda-90c6-4b91a2a0f449" containerName="extract-content" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148384 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="fecd6676-c293-4cda-90c6-4b91a2a0f449" containerName="extract-content" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148573 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="fecd6676-c293-4cda-90c6-4b91a2a0f449" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148602 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="785561f8-4a87-4e4e-8ba7-289f0b47e538" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.148617 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="82fe95d7-1f7f-4e1b-a3c5-e1858e7a181a" containerName="registry-server" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.149301 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-xshgf" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.151050 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.151348 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.151435 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.156052 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-xshgf"] Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.351586 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l65kt\" (UniqueName: \"kubernetes.io/projected/68262080-18cc-4d09-b2ef-2f7629f952ae-kube-api-access-l65kt\") pod \"auto-csr-approver-29566784-xshgf\" (UID: \"68262080-18cc-4d09-b2ef-2f7629f952ae\") " pod="openshift-infra/auto-csr-approver-29566784-xshgf" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.453327 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l65kt\" (UniqueName: \"kubernetes.io/projected/68262080-18cc-4d09-b2ef-2f7629f952ae-kube-api-access-l65kt\") pod \"auto-csr-approver-29566784-xshgf\" (UID: \"68262080-18cc-4d09-b2ef-2f7629f952ae\") " pod="openshift-infra/auto-csr-approver-29566784-xshgf" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.471761 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l65kt\" (UniqueName: \"kubernetes.io/projected/68262080-18cc-4d09-b2ef-2f7629f952ae-kube-api-access-l65kt\") pod \"auto-csr-approver-29566784-xshgf\" (UID: \"68262080-18cc-4d09-b2ef-2f7629f952ae\") " pod="openshift-infra/auto-csr-approver-29566784-xshgf" Mar 20 11:44:00 crc kubenswrapper[4772]: I0320 11:44:00.768188 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-xshgf" Mar 20 11:44:01 crc kubenswrapper[4772]: I0320 11:44:01.196708 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-xshgf"] Mar 20 11:44:01 crc kubenswrapper[4772]: I0320 11:44:01.521232 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-xshgf" event={"ID":"68262080-18cc-4d09-b2ef-2f7629f952ae","Type":"ContainerStarted","Data":"16a45f11b46be78f8359067b01b48bbc6ad38df582dde35c757ec0f0084fcb7a"} Mar 20 11:44:03 crc kubenswrapper[4772]: I0320 11:44:03.537368 4772 generic.go:334] "Generic (PLEG): container finished" podID="68262080-18cc-4d09-b2ef-2f7629f952ae" containerID="188f12a465b3dae06b4b37a58a01defaaefaa21a8d26f61b3d4349df09c58e97" exitCode=0 Mar 20 11:44:03 crc kubenswrapper[4772]: I0320 11:44:03.537489 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-xshgf" event={"ID":"68262080-18cc-4d09-b2ef-2f7629f952ae","Type":"ContainerDied","Data":"188f12a465b3dae06b4b37a58a01defaaefaa21a8d26f61b3d4349df09c58e97"} Mar 20 11:44:04 crc kubenswrapper[4772]: I0320 11:44:04.822973 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-xshgf" Mar 20 11:44:04 crc kubenswrapper[4772]: I0320 11:44:04.918197 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l65kt\" (UniqueName: \"kubernetes.io/projected/68262080-18cc-4d09-b2ef-2f7629f952ae-kube-api-access-l65kt\") pod \"68262080-18cc-4d09-b2ef-2f7629f952ae\" (UID: \"68262080-18cc-4d09-b2ef-2f7629f952ae\") " Mar 20 11:44:04 crc kubenswrapper[4772]: I0320 11:44:04.930365 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68262080-18cc-4d09-b2ef-2f7629f952ae-kube-api-access-l65kt" (OuterVolumeSpecName: "kube-api-access-l65kt") pod "68262080-18cc-4d09-b2ef-2f7629f952ae" (UID: "68262080-18cc-4d09-b2ef-2f7629f952ae"). InnerVolumeSpecName "kube-api-access-l65kt". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:44:05 crc kubenswrapper[4772]: I0320 11:44:05.019294 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l65kt\" (UniqueName: \"kubernetes.io/projected/68262080-18cc-4d09-b2ef-2f7629f952ae-kube-api-access-l65kt\") on node \"crc\" DevicePath \"\"" Mar 20 11:44:05 crc kubenswrapper[4772]: I0320 11:44:05.553674 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566784-xshgf" event={"ID":"68262080-18cc-4d09-b2ef-2f7629f952ae","Type":"ContainerDied","Data":"16a45f11b46be78f8359067b01b48bbc6ad38df582dde35c757ec0f0084fcb7a"} Mar 20 11:44:05 crc kubenswrapper[4772]: I0320 11:44:05.553719 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="16a45f11b46be78f8359067b01b48bbc6ad38df582dde35c757ec0f0084fcb7a" Mar 20 11:44:05 crc kubenswrapper[4772]: I0320 11:44:05.553787 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566784-xshgf" Mar 20 11:44:05 crc kubenswrapper[4772]: I0320 11:44:05.891183 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-pzbwx"] Mar 20 11:44:05 crc kubenswrapper[4772]: I0320 11:44:05.897663 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566778-pzbwx"] Mar 20 11:44:06 crc kubenswrapper[4772]: I0320 11:44:06.650103 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="047fb828-0158-4d01-b9eb-531e3399a2ff" path="/var/lib/kubelet/pods/047fb828-0158-4d01-b9eb-531e3399a2ff/volumes" Mar 20 11:44:27 crc kubenswrapper[4772]: I0320 11:44:27.794511 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-dsdnq/must-gather-k4dgg"] Mar 20 11:44:27 crc kubenswrapper[4772]: E0320 11:44:27.795444 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68262080-18cc-4d09-b2ef-2f7629f952ae" containerName="oc" Mar 20 11:44:27 crc kubenswrapper[4772]: I0320 11:44:27.795461 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="68262080-18cc-4d09-b2ef-2f7629f952ae" containerName="oc" Mar 20 11:44:27 crc kubenswrapper[4772]: I0320 11:44:27.795626 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="68262080-18cc-4d09-b2ef-2f7629f952ae" containerName="oc" Mar 20 11:44:27 crc kubenswrapper[4772]: I0320 11:44:27.796350 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dsdnq/must-gather-k4dgg" Mar 20 11:44:27 crc kubenswrapper[4772]: I0320 11:44:27.798076 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dsdnq"/"kube-root-ca.crt" Mar 20 11:44:27 crc kubenswrapper[4772]: I0320 11:44:27.798248 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-dsdnq"/"openshift-service-ca.crt" Mar 20 11:44:27 crc kubenswrapper[4772]: I0320 11:44:27.860954 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dsdnq/must-gather-k4dgg"] Mar 20 11:44:27 crc kubenswrapper[4772]: I0320 11:44:27.915943 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93253269-221c-4fa2-89e4-8f84eed7adfa-must-gather-output\") pod \"must-gather-k4dgg\" (UID: \"93253269-221c-4fa2-89e4-8f84eed7adfa\") " pod="openshift-must-gather-dsdnq/must-gather-k4dgg" Mar 20 11:44:27 crc kubenswrapper[4772]: I0320 11:44:27.916109 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xx99p\" (UniqueName: \"kubernetes.io/projected/93253269-221c-4fa2-89e4-8f84eed7adfa-kube-api-access-xx99p\") pod \"must-gather-k4dgg\" (UID: \"93253269-221c-4fa2-89e4-8f84eed7adfa\") " pod="openshift-must-gather-dsdnq/must-gather-k4dgg" Mar 20 11:44:28 crc kubenswrapper[4772]: I0320 11:44:28.017893 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xx99p\" (UniqueName: \"kubernetes.io/projected/93253269-221c-4fa2-89e4-8f84eed7adfa-kube-api-access-xx99p\") pod \"must-gather-k4dgg\" (UID: \"93253269-221c-4fa2-89e4-8f84eed7adfa\") " pod="openshift-must-gather-dsdnq/must-gather-k4dgg" Mar 20 11:44:28 crc kubenswrapper[4772]: I0320 11:44:28.017976 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93253269-221c-4fa2-89e4-8f84eed7adfa-must-gather-output\") pod \"must-gather-k4dgg\" (UID: \"93253269-221c-4fa2-89e4-8f84eed7adfa\") " pod="openshift-must-gather-dsdnq/must-gather-k4dgg" Mar 20 11:44:28 crc kubenswrapper[4772]: I0320 11:44:28.018594 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93253269-221c-4fa2-89e4-8f84eed7adfa-must-gather-output\") pod \"must-gather-k4dgg\" (UID: \"93253269-221c-4fa2-89e4-8f84eed7adfa\") " pod="openshift-must-gather-dsdnq/must-gather-k4dgg" Mar 20 11:44:28 crc kubenswrapper[4772]: I0320 11:44:28.037947 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xx99p\" (UniqueName: \"kubernetes.io/projected/93253269-221c-4fa2-89e4-8f84eed7adfa-kube-api-access-xx99p\") pod \"must-gather-k4dgg\" (UID: \"93253269-221c-4fa2-89e4-8f84eed7adfa\") " pod="openshift-must-gather-dsdnq/must-gather-k4dgg" Mar 20 11:44:28 crc kubenswrapper[4772]: I0320 11:44:28.114568 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-dsdnq/must-gather-k4dgg" Mar 20 11:44:28 crc kubenswrapper[4772]: I0320 11:44:28.535640 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-dsdnq/must-gather-k4dgg"] Mar 20 11:44:28 crc kubenswrapper[4772]: I0320 11:44:28.731889 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dsdnq/must-gather-k4dgg" event={"ID":"93253269-221c-4fa2-89e4-8f84eed7adfa","Type":"ContainerStarted","Data":"006d8c43b214c14f10daf9df04c7017ceeabf80f7d55063d5037d77bbc59f651"} Mar 20 11:44:36 crc kubenswrapper[4772]: I0320 11:44:36.794929 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dsdnq/must-gather-k4dgg" event={"ID":"93253269-221c-4fa2-89e4-8f84eed7adfa","Type":"ContainerStarted","Data":"cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db"} Mar 20 11:44:36 crc kubenswrapper[4772]: I0320 11:44:36.795520 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dsdnq/must-gather-k4dgg" event={"ID":"93253269-221c-4fa2-89e4-8f84eed7adfa","Type":"ContainerStarted","Data":"a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a"} Mar 20 11:44:36 crc kubenswrapper[4772]: I0320 11:44:36.816547 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-dsdnq/must-gather-k4dgg" podStartSLOduration=2.233917098 podStartE2EDuration="9.816524823s" podCreationTimestamp="2026-03-20 11:44:27 +0000 UTC" firstStartedPulling="2026-03-20 11:44:28.543691572 +0000 UTC m=+2954.634658057" lastFinishedPulling="2026-03-20 11:44:36.126299297 +0000 UTC m=+2962.217265782" observedRunningTime="2026-03-20 11:44:36.811213578 +0000 UTC m=+2962.902180063" watchObservedRunningTime="2026-03-20 11:44:36.816524823 +0000 UTC m=+2962.907491308" Mar 20 11:44:39 crc kubenswrapper[4772]: I0320 11:44:39.564577 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:44:39 crc kubenswrapper[4772]: I0320 11:44:39.564957 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:44:40 crc kubenswrapper[4772]: I0320 11:44:40.562613 4772 scope.go:117] "RemoveContainer" containerID="faf58613ba231ffd21859cedac83c6ec9658264f7b5daa5020df45c2b1191b66" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.154545 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h"] Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.158642 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.161183 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.166992 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h"] Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.177405 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.209782 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b39a6d86-0911-4f59-9034-d6930a190525-config-volume\") pod \"collect-profiles-29566785-9kr4h\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.209916 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b39a6d86-0911-4f59-9034-d6930a190525-secret-volume\") pod \"collect-profiles-29566785-9kr4h\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.209959 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4cs7\" (UniqueName: \"kubernetes.io/projected/b39a6d86-0911-4f59-9034-d6930a190525-kube-api-access-z4cs7\") pod \"collect-profiles-29566785-9kr4h\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.311549 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b39a6d86-0911-4f59-9034-d6930a190525-secret-volume\") pod \"collect-profiles-29566785-9kr4h\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.311624 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z4cs7\" (UniqueName: \"kubernetes.io/projected/b39a6d86-0911-4f59-9034-d6930a190525-kube-api-access-z4cs7\") pod \"collect-profiles-29566785-9kr4h\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.311706 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b39a6d86-0911-4f59-9034-d6930a190525-config-volume\") pod \"collect-profiles-29566785-9kr4h\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.313090 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b39a6d86-0911-4f59-9034-d6930a190525-config-volume\") pod 
\"collect-profiles-29566785-9kr4h\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.327740 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b39a6d86-0911-4f59-9034-d6930a190525-secret-volume\") pod \"collect-profiles-29566785-9kr4h\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.332934 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4cs7\" (UniqueName: \"kubernetes.io/projected/b39a6d86-0911-4f59-9034-d6930a190525-kube-api-access-z4cs7\") pod \"collect-profiles-29566785-9kr4h\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.493119 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:00 crc kubenswrapper[4772]: I0320 11:45:00.956951 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h"] Mar 20 11:45:01 crc kubenswrapper[4772]: E0320 11:45:01.433317 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb39a6d86_0911_4f59_9034_d6930a190525.slice/crio-21afb6490a3978cf097d348e0da1c4fb3a5fce2dc1be45ee29a3a18c3fab2861.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb39a6d86_0911_4f59_9034_d6930a190525.slice/crio-conmon-21afb6490a3978cf097d348e0da1c4fb3a5fce2dc1be45ee29a3a18c3fab2861.scope\": RecentStats: unable to find data in memory cache]" Mar 20 11:45:01 crc kubenswrapper[4772]: I0320 11:45:01.985510 4772 generic.go:334] "Generic (PLEG): container finished" podID="b39a6d86-0911-4f59-9034-d6930a190525" containerID="21afb6490a3978cf097d348e0da1c4fb3a5fce2dc1be45ee29a3a18c3fab2861" exitCode=0 Mar 20 11:45:01 crc kubenswrapper[4772]: I0320 11:45:01.985594 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" event={"ID":"b39a6d86-0911-4f59-9034-d6930a190525","Type":"ContainerDied","Data":"21afb6490a3978cf097d348e0da1c4fb3a5fce2dc1be45ee29a3a18c3fab2861"} Mar 20 11:45:01 crc kubenswrapper[4772]: I0320 11:45:01.986114 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" event={"ID":"b39a6d86-0911-4f59-9034-d6930a190525","Type":"ContainerStarted","Data":"0a87105a2bca4061f84fa4f6e3b6087ea4409be7da5947ffe02026c1db91aa9e"} Mar 20 11:45:03 crc kubenswrapper[4772]: I0320 11:45:03.262174 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:03 crc kubenswrapper[4772]: I0320 11:45:03.375585 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b39a6d86-0911-4f59-9034-d6930a190525-secret-volume\") pod \"b39a6d86-0911-4f59-9034-d6930a190525\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " Mar 20 11:45:03 crc kubenswrapper[4772]: I0320 11:45:03.375662 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4cs7\" (UniqueName: \"kubernetes.io/projected/b39a6d86-0911-4f59-9034-d6930a190525-kube-api-access-z4cs7\") pod \"b39a6d86-0911-4f59-9034-d6930a190525\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " Mar 20 11:45:03 crc kubenswrapper[4772]: I0320 11:45:03.375765 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b39a6d86-0911-4f59-9034-d6930a190525-config-volume\") pod \"b39a6d86-0911-4f59-9034-d6930a190525\" (UID: \"b39a6d86-0911-4f59-9034-d6930a190525\") " Mar 20 11:45:03 crc kubenswrapper[4772]: I0320 11:45:03.376585 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b39a6d86-0911-4f59-9034-d6930a190525-config-volume" (OuterVolumeSpecName: "config-volume") pod "b39a6d86-0911-4f59-9034-d6930a190525" (UID: "b39a6d86-0911-4f59-9034-d6930a190525"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4772]: I0320 11:45:03.383041 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b39a6d86-0911-4f59-9034-d6930a190525-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "b39a6d86-0911-4f59-9034-d6930a190525" (UID: "b39a6d86-0911-4f59-9034-d6930a190525"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4772]: I0320 11:45:03.383781 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b39a6d86-0911-4f59-9034-d6930a190525-kube-api-access-z4cs7" (OuterVolumeSpecName: "kube-api-access-z4cs7") pod "b39a6d86-0911-4f59-9034-d6930a190525" (UID: "b39a6d86-0911-4f59-9034-d6930a190525"). InnerVolumeSpecName "kube-api-access-z4cs7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:45:03 crc kubenswrapper[4772]: I0320 11:45:03.477255 4772 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/b39a6d86-0911-4f59-9034-d6930a190525-secret-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4772]: I0320 11:45:03.477299 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z4cs7\" (UniqueName: \"kubernetes.io/projected/b39a6d86-0911-4f59-9034-d6930a190525-kube-api-access-z4cs7\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:03 crc kubenswrapper[4772]: I0320 11:45:03.477314 4772 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b39a6d86-0911-4f59-9034-d6930a190525-config-volume\") on node \"crc\" DevicePath \"\"" Mar 20 11:45:04 crc kubenswrapper[4772]: I0320 11:45:04.005279 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" event={"ID":"b39a6d86-0911-4f59-9034-d6930a190525","Type":"ContainerDied","Data":"0a87105a2bca4061f84fa4f6e3b6087ea4409be7da5947ffe02026c1db91aa9e"} Mar 20 11:45:04 crc kubenswrapper[4772]: I0320 11:45:04.005349 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a87105a2bca4061f84fa4f6e3b6087ea4409be7da5947ffe02026c1db91aa9e" Mar 20 11:45:04 crc kubenswrapper[4772]: I0320 11:45:04.005344 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29566785-9kr4h" Mar 20 11:45:04 crc kubenswrapper[4772]: I0320 11:45:04.334053 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf"] Mar 20 11:45:04 crc kubenswrapper[4772]: I0320 11:45:04.343744 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29566740-7ndqf"] Mar 20 11:45:04 crc kubenswrapper[4772]: I0320 11:45:04.650302 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3a950f69-dc46-4917-b204-c8195f937827" path="/var/lib/kubelet/pods/3a950f69-dc46-4917-b204-c8195f937827/volumes" Mar 20 11:45:09 crc kubenswrapper[4772]: I0320 11:45:09.564663 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Mar 20 11:45:09 crc kubenswrapper[4772]: I0320 11:45:09.565044 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:45:23 crc kubenswrapper[4772]: I0320 11:45:23.143960 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-tkq8w_69d23241-cdf8-4417-bb47-d5541e49fb12/init/0.log" Mar 20 11:45:23 crc kubenswrapper[4772]: I0320 11:45:23.306927 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-tkq8w_69d23241-cdf8-4417-bb47-d5541e49fb12/dnsmasq-dns/0.log" Mar 20 11:45:23 crc kubenswrapper[4772]: I0320 11:45:23.440206 4772 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openstack_dnsmasq-dns-78dd6ddcc-tkq8w_69d23241-cdf8-4417-bb47-d5541e49fb12/init/0.log" Mar 20 11:45:38 crc kubenswrapper[4772]: I0320 11:45:38.294437 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl_26278da9-8afd-4b22-b53b-dc7334d50643/util/0.log" Mar 20 11:45:38 crc kubenswrapper[4772]: I0320 11:45:38.527668 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl_26278da9-8afd-4b22-b53b-dc7334d50643/pull/0.log" Mar 20 11:45:38 crc kubenswrapper[4772]: I0320 11:45:38.559209 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl_26278da9-8afd-4b22-b53b-dc7334d50643/util/0.log" Mar 20 11:45:38 crc kubenswrapper[4772]: I0320 11:45:38.571307 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl_26278da9-8afd-4b22-b53b-dc7334d50643/pull/0.log" Mar 20 11:45:38 crc kubenswrapper[4772]: I0320 11:45:38.778913 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl_26278da9-8afd-4b22-b53b-dc7334d50643/util/0.log" Mar 20 11:45:38 crc kubenswrapper[4772]: I0320 11:45:38.779040 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl_26278da9-8afd-4b22-b53b-dc7334d50643/pull/0.log" Mar 20 11:45:38 crc kubenswrapper[4772]: I0320 11:45:38.780987 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_8dffe7a686db782c3456e5c21f538572abfaf8c7d84273b1e1c55b08588wsdl_26278da9-8afd-4b22-b53b-dc7334d50643/extract/0.log" Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.021272 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_barbican-operator-controller-manager-59bc569d95-t4z2d_9980fd55-eca0-4c27-a021-59acc8681bfd/manager/0.log" Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.259373 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_designate-operator-controller-manager-588d4d986b-62t9t_3b1e9a1f-4847-46ed-9239-11b64f01ef55/manager/0.log" Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.352106 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_glance-operator-controller-manager-79df6bcc97-224ts_7ec0f9bb-bb24-4e95-8fb4-734eaee29058/manager/0.log" Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.428081 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_cinder-operator-controller-manager-8d58dc466-hlkjp_f6d09f24-ca68-486c-8fb6-e34e3172077a/manager/0.log" Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.464860 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_heat-operator-controller-manager-67dd5f86f5-92hmc_1854d930-56ec-441f-87d6-821b656cd195/manager/0.log" Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.564484 4772 patch_prober.go:28] interesting pod/machine-config-daemon-ltsw5 container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" 
start-of-body= Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.564580 4772 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.564661 4772 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.565489 4772 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a"} pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.565574 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerName="machine-config-daemon" containerID="cri-o://ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" gracePeriod=600 Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.683548 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-8464cc45fb-kh9lg_ab4a9bd4-c1e1-453d-b586-5089696704fb/manager/0.log" Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.753767 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_infra-operator-controller-manager-669fff9c7c-7tx6m_a7308ded-ef41-499c-ae52-13d9e32b51e1/manager/0.log" Mar 20 11:45:39 crc kubenswrapper[4772]: I0320 11:45:39.869112 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ironic-operator-controller-manager-6f787dddc9-tcjvl_127b144a-f395-409d-a0a4-b79b60a60c1f/manager/0.log" Mar 20 11:45:40 crc kubenswrapper[4772]: E0320 11:45:40.216453 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:45:40 crc kubenswrapper[4772]: I0320 11:45:40.241319 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_keystone-operator-controller-manager-768b96df4c-g8thw_52caed6a-6dcf-40d0-929c-948d0e421958/manager/0.log" Mar 20 11:45:40 crc kubenswrapper[4772]: I0320 11:45:40.255587 4772 generic.go:334] "Generic (PLEG): container finished" podID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" exitCode=0 Mar 20 11:45:40 crc kubenswrapper[4772]: I0320 11:45:40.255649 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerDied","Data":"ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a"} Mar 20 11:45:40 crc 
kubenswrapper[4772]: I0320 11:45:40.255705 4772 scope.go:117] "RemoveContainer" containerID="d1c815217e6d78011cef6fae18a43310abbde6caa4920ff34c00cf94580234b9" Mar 20 11:45:40 crc kubenswrapper[4772]: I0320 11:45:40.256622 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:45:40 crc kubenswrapper[4772]: E0320 11:45:40.256915 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:45:40 crc kubenswrapper[4772]: I0320 11:45:40.277111 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_manila-operator-controller-manager-55f864c847-bvmfv_51140d29-3d51-44e1-a884-ffbad20bbb15/manager/0.log" Mar 20 11:45:40 crc kubenswrapper[4772]: I0320 11:45:40.452116 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_mariadb-operator-controller-manager-67ccfc9778-qcwm6_953ed1af-7caa-4fe5-8443-3e5aa1caa77c/manager/0.log" Mar 20 11:45:40 crc kubenswrapper[4772]: I0320 11:45:40.577555 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_neutron-operator-controller-manager-767865f676-wjc8s_e6248bcf-f076-4165-a9c7-0239c16e980d/manager/0.log" Mar 20 11:45:40 crc kubenswrapper[4772]: I0320 11:45:40.670521 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_nova-operator-controller-manager-5d488d59fb-jxssq_546c5d3b-9054-45fb-9e45-95d01b61d012/manager/0.log" Mar 20 11:45:40 crc kubenswrapper[4772]: I0320 11:45:40.849449 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_octavia-operator-controller-manager-5b9f45d989-7p7tf_5c181b83-da5d-454e-a061-c647bde19d5e/manager/0.log" Mar 20 11:45:40 crc kubenswrapper[4772]: I0320 11:45:40.926879 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-baremetal-operator-controller-manager-89d64c458-cnssm_a1c1c7a4-6840-4e52-b242-7de225eaac97/manager/0.log" Mar 20 11:45:41 crc kubenswrapper[4772]: I0320 11:45:41.158895 4772 scope.go:117] "RemoveContainer" containerID="f59ea5a510105fe60a16d016b3195a261750ad3d7095bbe1ac704d173bebe82c" Mar 20 11:45:41 crc kubenswrapper[4772]: I0320 11:45:41.173757 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-init-846ffbb776-fc7k4_571efd1c-abe7-4edc-a3d8-508b2ec30b37/operator/0.log" Mar 20 11:45:41 crc kubenswrapper[4772]: I0320 11:45:41.259531 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-controller-manager-6697dffbc-xp244_48ac539f-199f-49e4-8330-8956df8ea12f/manager/0.log" Mar 20 11:45:41 crc kubenswrapper[4772]: I0320 11:45:41.471666 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_openstack-operator-index-mm84p_775463b1-27bc-4fe6-81bf-81170d04d6bf/registry-server/0.log" Mar 20 11:45:41 crc kubenswrapper[4772]: I0320 11:45:41.487063 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_ovn-operator-controller-manager-884679f54-qdjbq_574f98d0-6e6f-43b0-a4d1-b13a5d123536/manager/0.log" Mar 20 11:45:41 crc 
kubenswrapper[4772]: I0320 11:45:41.700955 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_swift-operator-controller-manager-c674c5965-lz8q4_55e1d800-c5ed-4902-bfb3-b36e761a526b/manager/0.log" Mar 20 11:45:41 crc kubenswrapper[4772]: I0320 11:45:41.702430 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_placement-operator-controller-manager-5784578c99-jtzxg_32c94d79-871d-465b-afe6-e929661093c6/manager/0.log" Mar 20 11:45:41 crc kubenswrapper[4772]: I0320 11:45:41.967621 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_telemetry-operator-controller-manager-d6b694c5-nb5t9_d85081ab-dafc-467c-9445-9b7b221a56ee/manager/0.log" Mar 20 11:45:42 crc kubenswrapper[4772]: I0320 11:45:42.012154 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_test-operator-controller-manager-5c5cb9c4d7-s5gs7_e23924f7-44e6-4464-a6d6-240718124df8/manager/0.log" Mar 20 11:45:42 crc kubenswrapper[4772]: I0320 11:45:42.187968 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_watcher-operator-controller-manager-6c4d75f7f9-f488c_3f853c54-2e6e-41b9-b15f-3435b17477f2/manager/0.log" Mar 20 11:45:50 crc kubenswrapper[4772]: I0320 11:45:50.643172 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:45:50 crc kubenswrapper[4772]: E0320 11:45:50.644044 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.143548 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566786-jmwwq"] Mar 20 11:46:00 crc kubenswrapper[4772]: E0320 11:46:00.144898 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b39a6d86-0911-4f59-9034-d6930a190525" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.144918 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="b39a6d86-0911-4f59-9034-d6930a190525" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.145115 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="b39a6d86-0911-4f59-9034-d6930a190525" containerName="collect-profiles" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.145700 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-jmwwq" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.148187 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.148647 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.148957 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.157096 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-jmwwq"] Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.316096 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wptcb\" (UniqueName: \"kubernetes.io/projected/5f43a7c9-1e9f-4224-b43a-a972ab740b28-kube-api-access-wptcb\") pod \"auto-csr-approver-29566786-jmwwq\" (UID: \"5f43a7c9-1e9f-4224-b43a-a972ab740b28\") " pod="openshift-infra/auto-csr-approver-29566786-jmwwq" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.418210 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wptcb\" (UniqueName: \"kubernetes.io/projected/5f43a7c9-1e9f-4224-b43a-a972ab740b28-kube-api-access-wptcb\") pod \"auto-csr-approver-29566786-jmwwq\" (UID: \"5f43a7c9-1e9f-4224-b43a-a972ab740b28\") " pod="openshift-infra/auto-csr-approver-29566786-jmwwq" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.439736 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wptcb\" (UniqueName: \"kubernetes.io/projected/5f43a7c9-1e9f-4224-b43a-a972ab740b28-kube-api-access-wptcb\") pod \"auto-csr-approver-29566786-jmwwq\" (UID: \"5f43a7c9-1e9f-4224-b43a-a972ab740b28\") " pod="openshift-infra/auto-csr-approver-29566786-jmwwq" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.469158 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-jmwwq" Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.873352 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566786-jmwwq"] Mar 20 11:46:00 crc kubenswrapper[4772]: I0320 11:46:00.884875 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:46:01 crc kubenswrapper[4772]: I0320 11:46:01.386484 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-jmwwq" event={"ID":"5f43a7c9-1e9f-4224-b43a-a972ab740b28","Type":"ContainerStarted","Data":"af24ef3e757b59a985535726d0882a1d931b670d3b23a0e4cd0cf3edd7a0c2b3"} Mar 20 11:46:01 crc kubenswrapper[4772]: I0320 11:46:01.901635 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-8l854_9f4d9edb-87ce-41e1-9cc0-aaf07230ec92/control-plane-machine-set-operator/0.log" Mar 20 11:46:02 crc kubenswrapper[4772]: I0320 11:46:02.054388 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lvstj_45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6/kube-rbac-proxy/0.log" Mar 20 11:46:02 crc kubenswrapper[4772]: I0320 11:46:02.097111 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-lvstj_45c4e1f3-f8fb-41c5-96a1-dc5f86f256e6/machine-api-operator/0.log" Mar 20 11:46:02 crc kubenswrapper[4772]: I0320 11:46:02.396057 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-jmwwq" event={"ID":"5f43a7c9-1e9f-4224-b43a-a972ab740b28","Type":"ContainerStarted","Data":"aa5098b753c1e29244ff5fd6e09a4bcef7e49bb861f410f5a6283f8c6f4914f2"} Mar 20 11:46:02 crc kubenswrapper[4772]: I0320 11:46:02.416140 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566786-jmwwq" podStartSLOduration=1.263860858 podStartE2EDuration="2.41610448s" podCreationTimestamp="2026-03-20 11:46:00 +0000 UTC" firstStartedPulling="2026-03-20 11:46:00.884600099 +0000 UTC m=+3046.975566584" lastFinishedPulling="2026-03-20 11:46:02.036843721 +0000 UTC m=+3048.127810206" observedRunningTime="2026-03-20 11:46:02.412222383 +0000 UTC m=+3048.503188868" watchObservedRunningTime="2026-03-20 11:46:02.41610448 +0000 UTC m=+3048.507070965" Mar 20 11:46:03 crc kubenswrapper[4772]: I0320 11:46:03.405052 4772 generic.go:334] "Generic (PLEG): container finished" podID="5f43a7c9-1e9f-4224-b43a-a972ab740b28" containerID="aa5098b753c1e29244ff5fd6e09a4bcef7e49bb861f410f5a6283f8c6f4914f2" exitCode=0 Mar 20 11:46:03 crc kubenswrapper[4772]: I0320 11:46:03.405578 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-jmwwq" event={"ID":"5f43a7c9-1e9f-4224-b43a-a972ab740b28","Type":"ContainerDied","Data":"aa5098b753c1e29244ff5fd6e09a4bcef7e49bb861f410f5a6283f8c6f4914f2"} Mar 20 11:46:03 crc kubenswrapper[4772]: I0320 11:46:03.641948 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:46:03 crc kubenswrapper[4772]: E0320 11:46:03.642206 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:46:04 crc kubenswrapper[4772]: I0320 11:46:04.679967 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-jmwwq" Mar 20 11:46:04 crc kubenswrapper[4772]: I0320 11:46:04.780230 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wptcb\" (UniqueName: \"kubernetes.io/projected/5f43a7c9-1e9f-4224-b43a-a972ab740b28-kube-api-access-wptcb\") pod \"5f43a7c9-1e9f-4224-b43a-a972ab740b28\" (UID: \"5f43a7c9-1e9f-4224-b43a-a972ab740b28\") " Mar 20 11:46:04 crc kubenswrapper[4772]: I0320 11:46:04.792011 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f43a7c9-1e9f-4224-b43a-a972ab740b28-kube-api-access-wptcb" (OuterVolumeSpecName: "kube-api-access-wptcb") pod "5f43a7c9-1e9f-4224-b43a-a972ab740b28" (UID: "5f43a7c9-1e9f-4224-b43a-a972ab740b28"). InnerVolumeSpecName "kube-api-access-wptcb". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:46:04 crc kubenswrapper[4772]: I0320 11:46:04.882621 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wptcb\" (UniqueName: \"kubernetes.io/projected/5f43a7c9-1e9f-4224-b43a-a972ab740b28-kube-api-access-wptcb\") on node \"crc\" DevicePath \"\"" Mar 20 11:46:05 crc kubenswrapper[4772]: I0320 11:46:05.420360 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566786-jmwwq" event={"ID":"5f43a7c9-1e9f-4224-b43a-a972ab740b28","Type":"ContainerDied","Data":"af24ef3e757b59a985535726d0882a1d931b670d3b23a0e4cd0cf3edd7a0c2b3"} Mar 20 11:46:05 crc kubenswrapper[4772]: I0320 11:46:05.420400 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="af24ef3e757b59a985535726d0882a1d931b670d3b23a0e4cd0cf3edd7a0c2b3" Mar 20 11:46:05 crc kubenswrapper[4772]: I0320 11:46:05.420423 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566786-jmwwq" Mar 20 11:46:05 crc kubenswrapper[4772]: I0320 11:46:05.477065 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-pxflm"] Mar 20 11:46:05 crc kubenswrapper[4772]: I0320 11:46:05.482613 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566780-pxflm"] Mar 20 11:46:06 crc kubenswrapper[4772]: I0320 11:46:06.650986 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43fa2af6-cad9-4228-b5cf-d017b9b55c23" path="/var/lib/kubelet/pods/43fa2af6-cad9-4228-b5cf-d017b9b55c23/volumes" Mar 20 11:46:13 crc kubenswrapper[4772]: I0320 11:46:13.440234 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-858654f9db-c2s9c_e0cd75e7-19e1-431d-a863-c4ed52878e91/cert-manager-controller/0.log" Mar 20 11:46:13 crc kubenswrapper[4772]: I0320 11:46:13.584171 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-cainjector-cf98fcc89-b4mlm_693f7934-75ca-41bc-9bc1-20f7b9da436e/cert-manager-cainjector/0.log" Mar 20 11:46:13 crc kubenswrapper[4772]: I0320 11:46:13.648123 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/cert-manager_cert-manager-webhook-687f57d79b-rgz26_7ef174dc-35a4-4a44-a5a5-7f7d48284b14/cert-manager-webhook/0.log" Mar 20 11:46:14 crc kubenswrapper[4772]: I0320 11:46:14.645201 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:46:14 crc kubenswrapper[4772]: E0320 11:46:14.645455 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:46:25 crc kubenswrapper[4772]: I0320 11:46:25.694776 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-console-plugin-86f58fcf4-gp8fc_40b08b4b-058f-409a-9a32-d372878de5ad/nmstate-console-plugin/0.log" Mar 20 11:46:25 crc kubenswrapper[4772]: I0320 11:46:25.934408 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-handler-8zk97_097b90ac-6ee9-4601-9a4d-db33981b1878/nmstate-handler/0.log" Mar 20 11:46:25 crc kubenswrapper[4772]: I0320 11:46:25.960415 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-jx96w_986d86cc-279b-4511-95b8-10f80268aad4/kube-rbac-proxy/0.log" Mar 20 11:46:26 crc kubenswrapper[4772]: I0320 11:46:26.094189 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-metrics-9b8c8685d-jx96w_986d86cc-279b-4511-95b8-10f80268aad4/nmstate-metrics/0.log" Mar 20 11:46:26 crc kubenswrapper[4772]: I0320 11:46:26.243184 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-operator-796d4cfff4-f2f98_0c24388b-eb64-4fc4-a732-4dd168057b7a/nmstate-operator/0.log" Mar 20 11:46:26 crc kubenswrapper[4772]: I0320 11:46:26.316515 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-nmstate_nmstate-webhook-5f558f5558-cclbs_bc2c37a7-9314-4e8d-9e9e-3e98cb73aded/nmstate-webhook/0.log" Mar 20 11:46:26 crc kubenswrapper[4772]: 
I0320 11:46:26.641435 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:46:26 crc kubenswrapper[4772]: E0320 11:46:26.641694 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:46:41 crc kubenswrapper[4772]: I0320 11:46:41.204001 4772 scope.go:117] "RemoveContainer" containerID="79d73e80e31fe63222b9ca1af50bc59245f704d91c8e4c923149cc4e98215a95" Mar 20 11:46:41 crc kubenswrapper[4772]: I0320 11:46:41.641637 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:46:41 crc kubenswrapper[4772]: E0320 11:46:41.641872 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:46:52 crc kubenswrapper[4772]: I0320 11:46:52.688812 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-2vhsh_1438038c-9bda-41f0-afb8-a16406defd25/kube-rbac-proxy/0.log" Mar 20 11:46:52 crc kubenswrapper[4772]: I0320 11:46:52.750119 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-7bb4cc7c98-2vhsh_1438038c-9bda-41f0-afb8-a16406defd25/controller/0.log" Mar 20 11:46:52 crc kubenswrapper[4772]: I0320 11:46:52.863977 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-frr-files/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.077426 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-frr-files/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.078820 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-reloader/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.095061 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-metrics/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.112992 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-reloader/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.275478 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-frr-files/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.298054 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-metrics/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.311814 4772 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-reloader/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.344227 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-metrics/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.514343 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-frr-files/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.535058 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/controller/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.536314 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-reloader/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.576235 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/cp-metrics/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.697625 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/frr-metrics/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.794215 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/kube-rbac-proxy/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.805812 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/kube-rbac-proxy-frr/0.log" Mar 20 11:46:53 crc kubenswrapper[4772]: I0320 11:46:53.993795 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/reloader/0.log" Mar 20 11:46:54 crc kubenswrapper[4772]: I0320 11:46:54.092504 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-bcc4b6f68-jbcbn_22225148-3160-41cc-b52b-c294e4e51a57/frr-k8s-webhook-server/0.log" Mar 20 11:46:54 crc kubenswrapper[4772]: I0320 11:46:54.118734 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-g7pwp_b655b584-dc09-480e-8f60-9f7ff0608456/frr/0.log" Mar 20 11:46:54 crc kubenswrapper[4772]: I0320 11:46:54.251090 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-6f47d558c9-8f9x8_344e34de-aef3-4ac6-9492-e6d359b5966d/manager/0.log" Mar 20 11:46:54 crc kubenswrapper[4772]: I0320 11:46:54.305720 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-5bcf94d488-gf9n5_b117c6bf-28df-4a1b-bda6-b96dc51f6531/webhook-server/0.log" Mar 20 11:46:54 crc kubenswrapper[4772]: I0320 11:46:54.427036 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jk4jn_d4433c35-0c82-4feb-aedf-0c617ef9ff25/kube-rbac-proxy/0.log" Mar 20 11:46:54 crc kubenswrapper[4772]: I0320 11:46:54.598180 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-jk4jn_d4433c35-0c82-4feb-aedf-0c617ef9ff25/speaker/0.log" Mar 20 11:46:56 crc kubenswrapper[4772]: I0320 11:46:56.641824 4772 scope.go:117] "RemoveContainer" 
containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:46:56 crc kubenswrapper[4772]: E0320 11:46:56.642431 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.044672 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2_6f03ad17-5689-47ee-87f9-cb5562711b9d/util/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.205270 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2_6f03ad17-5689-47ee-87f9-cb5562711b9d/pull/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.246783 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2_6f03ad17-5689-47ee-87f9-cb5562711b9d/pull/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.274587 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2_6f03ad17-5689-47ee-87f9-cb5562711b9d/util/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.432728 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2_6f03ad17-5689-47ee-87f9-cb5562711b9d/pull/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.449880 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2_6f03ad17-5689-47ee-87f9-cb5562711b9d/util/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.470514 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_1d8741a795bd73341bdd61a6e59c08511cf9466dbb5fc4045ac2dde874qhqk2_6f03ad17-5689-47ee-87f9-cb5562711b9d/extract/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.608544 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x_860d09d3-69c4-44e1-9756-cbd62cdd94cc/util/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.745957 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x_860d09d3-69c4-44e1-9756-cbd62cdd94cc/util/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.771051 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x_860d09d3-69c4-44e1-9756-cbd62cdd94cc/pull/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.790998 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x_860d09d3-69c4-44e1-9756-cbd62cdd94cc/pull/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.971456 4772 log.go:25] "Finished parsing 
log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x_860d09d3-69c4-44e1-9756-cbd62cdd94cc/pull/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.972617 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x_860d09d3-69c4-44e1-9756-cbd62cdd94cc/extract/0.log" Mar 20 11:47:06 crc kubenswrapper[4772]: I0320 11:47:06.979725 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_2d3ddce10053cc6867b5a0ce1614b30225f3a63fab79a72148165675c1c898x_860d09d3-69c4-44e1-9756-cbd62cdd94cc/util/0.log" Mar 20 11:47:07 crc kubenswrapper[4772]: I0320 11:47:07.134317 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zrgrl_ec7931ba-a74f-4de4-9936-4eb143aaadf7/extract-utilities/0.log" Mar 20 11:47:07 crc kubenswrapper[4772]: I0320 11:47:07.317916 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zrgrl_ec7931ba-a74f-4de4-9936-4eb143aaadf7/extract-content/0.log" Mar 20 11:47:07 crc kubenswrapper[4772]: I0320 11:47:07.331752 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zrgrl_ec7931ba-a74f-4de4-9936-4eb143aaadf7/extract-utilities/0.log" Mar 20 11:47:07 crc kubenswrapper[4772]: I0320 11:47:07.332795 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zrgrl_ec7931ba-a74f-4de4-9936-4eb143aaadf7/extract-content/0.log" Mar 20 11:47:07 crc kubenswrapper[4772]: I0320 11:47:07.507763 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zrgrl_ec7931ba-a74f-4de4-9936-4eb143aaadf7/extract-content/0.log" Mar 20 11:47:07 crc kubenswrapper[4772]: I0320 11:47:07.527226 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zrgrl_ec7931ba-a74f-4de4-9936-4eb143aaadf7/extract-utilities/0.log" Mar 20 11:47:07 crc kubenswrapper[4772]: I0320 11:47:07.747743 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7fqv7_33660497-fea8-485a-a5ce-aebc8c8fe8a8/extract-utilities/0.log" Mar 20 11:47:07 crc kubenswrapper[4772]: I0320 11:47:07.947806 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7fqv7_33660497-fea8-485a-a5ce-aebc8c8fe8a8/extract-utilities/0.log" Mar 20 11:47:07 crc kubenswrapper[4772]: I0320 11:47:07.981287 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7fqv7_33660497-fea8-485a-a5ce-aebc8c8fe8a8/extract-content/0.log" Mar 20 11:47:07 crc kubenswrapper[4772]: I0320 11:47:07.992697 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7fqv7_33660497-fea8-485a-a5ce-aebc8c8fe8a8/extract-content/0.log" Mar 20 11:47:08 crc kubenswrapper[4772]: I0320 11:47:08.178967 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-zrgrl_ec7931ba-a74f-4de4-9936-4eb143aaadf7/registry-server/0.log" Mar 20 11:47:08 crc kubenswrapper[4772]: I0320 11:47:08.194399 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7fqv7_33660497-fea8-485a-a5ce-aebc8c8fe8a8/extract-utilities/0.log" Mar 20 11:47:08 crc 
kubenswrapper[4772]: I0320 11:47:08.212240 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7fqv7_33660497-fea8-485a-a5ce-aebc8c8fe8a8/extract-content/0.log" Mar 20 11:47:08 crc kubenswrapper[4772]: I0320 11:47:08.427566 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-gshhb_a15bf049-cf58-4f6d-942c-9bdaac82f6df/marketplace-operator/0.log" Mar 20 11:47:08 crc kubenswrapper[4772]: I0320 11:47:08.523038 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-7fqv7_33660497-fea8-485a-a5ce-aebc8c8fe8a8/registry-server/0.log" Mar 20 11:47:08 crc kubenswrapper[4772]: I0320 11:47:08.630915 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rpss5_b4a24fdf-708a-4252-9889-6d0c68ad8a5a/extract-utilities/0.log" Mar 20 11:47:08 crc kubenswrapper[4772]: I0320 11:47:08.807211 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rpss5_b4a24fdf-708a-4252-9889-6d0c68ad8a5a/extract-utilities/0.log" Mar 20 11:47:08 crc kubenswrapper[4772]: I0320 11:47:08.807410 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rpss5_b4a24fdf-708a-4252-9889-6d0c68ad8a5a/extract-content/0.log" Mar 20 11:47:08 crc kubenswrapper[4772]: I0320 11:47:08.809305 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rpss5_b4a24fdf-708a-4252-9889-6d0c68ad8a5a/extract-content/0.log" Mar 20 11:47:09 crc kubenswrapper[4772]: I0320 11:47:09.007004 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rpss5_b4a24fdf-708a-4252-9889-6d0c68ad8a5a/extract-utilities/0.log" Mar 20 11:47:09 crc kubenswrapper[4772]: I0320 11:47:09.033583 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rpss5_b4a24fdf-708a-4252-9889-6d0c68ad8a5a/extract-content/0.log" Mar 20 11:47:09 crc kubenswrapper[4772]: I0320 11:47:09.146764 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-rpss5_b4a24fdf-708a-4252-9889-6d0c68ad8a5a/registry-server/0.log" Mar 20 11:47:09 crc kubenswrapper[4772]: I0320 11:47:09.256727 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rv948_8003a3e3-923e-4962-a56c-7499ddb205ba/extract-utilities/0.log" Mar 20 11:47:09 crc kubenswrapper[4772]: I0320 11:47:09.375508 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rv948_8003a3e3-923e-4962-a56c-7499ddb205ba/extract-utilities/0.log" Mar 20 11:47:09 crc kubenswrapper[4772]: I0320 11:47:09.378517 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rv948_8003a3e3-923e-4962-a56c-7499ddb205ba/extract-content/0.log" Mar 20 11:47:09 crc kubenswrapper[4772]: I0320 11:47:09.422005 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rv948_8003a3e3-923e-4962-a56c-7499ddb205ba/extract-content/0.log" Mar 20 11:47:09 crc kubenswrapper[4772]: I0320 11:47:09.558555 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rv948_8003a3e3-923e-4962-a56c-7499ddb205ba/extract-utilities/0.log" Mar 20 11:47:09 crc kubenswrapper[4772]: 
I0320 11:47:09.577487 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rv948_8003a3e3-923e-4962-a56c-7499ddb205ba/extract-content/0.log" Mar 20 11:47:10 crc kubenswrapper[4772]: I0320 11:47:10.060965 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-rv948_8003a3e3-923e-4962-a56c-7499ddb205ba/registry-server/0.log" Mar 20 11:47:10 crc kubenswrapper[4772]: I0320 11:47:10.642511 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:47:10 crc kubenswrapper[4772]: E0320 11:47:10.642784 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:47:21 crc kubenswrapper[4772]: I0320 11:47:21.642781 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:47:21 crc kubenswrapper[4772]: E0320 11:47:21.643498 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:47:32 crc kubenswrapper[4772]: I0320 11:47:32.646986 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:47:32 crc kubenswrapper[4772]: E0320 11:47:32.647726 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:47:44 crc kubenswrapper[4772]: I0320 11:47:44.647276 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:47:44 crc kubenswrapper[4772]: E0320 11:47:44.648180 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:47:56 crc kubenswrapper[4772]: I0320 11:47:56.643712 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:47:56 crc kubenswrapper[4772]: E0320 11:47:56.645184 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed 
container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.161210 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566788-mlzds"] Mar 20 11:48:00 crc kubenswrapper[4772]: E0320 11:48:00.161700 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f43a7c9-1e9f-4224-b43a-a972ab740b28" containerName="oc" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.161713 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f43a7c9-1e9f-4224-b43a-a972ab740b28" containerName="oc" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.161838 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="5f43a7c9-1e9f-4224-b43a-a972ab740b28" containerName="oc" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.162267 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-mlzds" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.163865 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.164788 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.165725 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.174878 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-mlzds"] Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.271309 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7txgh\" (UniqueName: \"kubernetes.io/projected/cddea62b-62e3-4285-88f3-f531265f0e1c-kube-api-access-7txgh\") pod \"auto-csr-approver-29566788-mlzds\" (UID: \"cddea62b-62e3-4285-88f3-f531265f0e1c\") " pod="openshift-infra/auto-csr-approver-29566788-mlzds" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.372751 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7txgh\" (UniqueName: \"kubernetes.io/projected/cddea62b-62e3-4285-88f3-f531265f0e1c-kube-api-access-7txgh\") pod \"auto-csr-approver-29566788-mlzds\" (UID: \"cddea62b-62e3-4285-88f3-f531265f0e1c\") " pod="openshift-infra/auto-csr-approver-29566788-mlzds" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.391303 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7txgh\" (UniqueName: \"kubernetes.io/projected/cddea62b-62e3-4285-88f3-f531265f0e1c-kube-api-access-7txgh\") pod \"auto-csr-approver-29566788-mlzds\" (UID: \"cddea62b-62e3-4285-88f3-f531265f0e1c\") " pod="openshift-infra/auto-csr-approver-29566788-mlzds" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.525139 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-mlzds" Mar 20 11:48:00 crc kubenswrapper[4772]: I0320 11:48:00.940753 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566788-mlzds"] Mar 20 11:48:01 crc kubenswrapper[4772]: I0320 11:48:01.229997 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-mlzds" event={"ID":"cddea62b-62e3-4285-88f3-f531265f0e1c","Type":"ContainerStarted","Data":"46fcaf79906aed446b776f30c16a08d692c3bf7408d0eb3a1106b2f8c48391b6"} Mar 20 11:48:03 crc kubenswrapper[4772]: I0320 11:48:03.250176 4772 generic.go:334] "Generic (PLEG): container finished" podID="cddea62b-62e3-4285-88f3-f531265f0e1c" containerID="bd8ae805e353ae1139d6564981e372bea64855b1c1f3bf4635d7cf126087519f" exitCode=0 Mar 20 11:48:03 crc kubenswrapper[4772]: I0320 11:48:03.250452 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-mlzds" event={"ID":"cddea62b-62e3-4285-88f3-f531265f0e1c","Type":"ContainerDied","Data":"bd8ae805e353ae1139d6564981e372bea64855b1c1f3bf4635d7cf126087519f"} Mar 20 11:48:04 crc kubenswrapper[4772]: I0320 11:48:04.540669 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-mlzds" Mar 20 11:48:04 crc kubenswrapper[4772]: I0320 11:48:04.640656 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7txgh\" (UniqueName: \"kubernetes.io/projected/cddea62b-62e3-4285-88f3-f531265f0e1c-kube-api-access-7txgh\") pod \"cddea62b-62e3-4285-88f3-f531265f0e1c\" (UID: \"cddea62b-62e3-4285-88f3-f531265f0e1c\") " Mar 20 11:48:04 crc kubenswrapper[4772]: I0320 11:48:04.650022 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cddea62b-62e3-4285-88f3-f531265f0e1c-kube-api-access-7txgh" (OuterVolumeSpecName: "kube-api-access-7txgh") pod "cddea62b-62e3-4285-88f3-f531265f0e1c" (UID: "cddea62b-62e3-4285-88f3-f531265f0e1c"). InnerVolumeSpecName "kube-api-access-7txgh". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:48:04 crc kubenswrapper[4772]: I0320 11:48:04.742689 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7txgh\" (UniqueName: \"kubernetes.io/projected/cddea62b-62e3-4285-88f3-f531265f0e1c-kube-api-access-7txgh\") on node \"crc\" DevicePath \"\"" Mar 20 11:48:04 crc kubenswrapper[4772]: E0320 11:48:04.833956 4772 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcddea62b_62e3_4285_88f3_f531265f0e1c.slice/crio-46fcaf79906aed446b776f30c16a08d692c3bf7408d0eb3a1106b2f8c48391b6\": RecentStats: unable to find data in memory cache]" Mar 20 11:48:05 crc kubenswrapper[4772]: I0320 11:48:05.274036 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566788-mlzds" event={"ID":"cddea62b-62e3-4285-88f3-f531265f0e1c","Type":"ContainerDied","Data":"46fcaf79906aed446b776f30c16a08d692c3bf7408d0eb3a1106b2f8c48391b6"} Mar 20 11:48:05 crc kubenswrapper[4772]: I0320 11:48:05.274063 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566788-mlzds" Mar 20 11:48:05 crc kubenswrapper[4772]: I0320 11:48:05.274087 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="46fcaf79906aed446b776f30c16a08d692c3bf7408d0eb3a1106b2f8c48391b6" Mar 20 11:48:05 crc kubenswrapper[4772]: I0320 11:48:05.664261 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-ck5fr"] Mar 20 11:48:05 crc kubenswrapper[4772]: I0320 11:48:05.671539 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566782-ck5fr"] Mar 20 11:48:06 crc kubenswrapper[4772]: I0320 11:48:06.649890 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2" path="/var/lib/kubelet/pods/e2e8c0d9-9b5c-481e-a21c-b43f12e74bf2/volumes" Mar 20 11:48:07 crc kubenswrapper[4772]: I0320 11:48:07.642285 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:48:07 crc kubenswrapper[4772]: E0320 11:48:07.642761 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:48:21 crc kubenswrapper[4772]: I0320 11:48:21.641869 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:48:21 crc kubenswrapper[4772]: E0320 11:48:21.642605 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:48:30 crc kubenswrapper[4772]: I0320 11:48:30.449164 4772 generic.go:334] "Generic (PLEG): container finished" podID="93253269-221c-4fa2-89e4-8f84eed7adfa" containerID="a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a" exitCode=0 Mar 20 11:48:30 crc kubenswrapper[4772]: I0320 11:48:30.449252 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-dsdnq/must-gather-k4dgg" event={"ID":"93253269-221c-4fa2-89e4-8f84eed7adfa","Type":"ContainerDied","Data":"a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a"} Mar 20 11:48:30 crc kubenswrapper[4772]: I0320 11:48:30.450465 4772 scope.go:117] "RemoveContainer" containerID="a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a" Mar 20 11:48:30 crc kubenswrapper[4772]: I0320 11:48:30.681898 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dsdnq_must-gather-k4dgg_93253269-221c-4fa2-89e4-8f84eed7adfa/gather/0.log" Mar 20 11:48:33 crc kubenswrapper[4772]: I0320 11:48:33.641939 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:48:33 crc kubenswrapper[4772]: E0320 11:48:33.642404 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for 
\"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:48:37 crc kubenswrapper[4772]: I0320 11:48:37.672294 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-dsdnq/must-gather-k4dgg"] Mar 20 11:48:37 crc kubenswrapper[4772]: I0320 11:48:37.678674 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-dsdnq/must-gather-k4dgg"] Mar 20 11:48:37 crc kubenswrapper[4772]: I0320 11:48:37.678795 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-dsdnq/must-gather-k4dgg" podUID="93253269-221c-4fa2-89e4-8f84eed7adfa" containerName="copy" containerID="cri-o://cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db" gracePeriod=2 Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.047858 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dsdnq_must-gather-k4dgg_93253269-221c-4fa2-89e4-8f84eed7adfa/copy/0.log" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.048309 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dsdnq/must-gather-k4dgg" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.222670 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93253269-221c-4fa2-89e4-8f84eed7adfa-must-gather-output\") pod \"93253269-221c-4fa2-89e4-8f84eed7adfa\" (UID: \"93253269-221c-4fa2-89e4-8f84eed7adfa\") " Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.222786 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xx99p\" (UniqueName: \"kubernetes.io/projected/93253269-221c-4fa2-89e4-8f84eed7adfa-kube-api-access-xx99p\") pod \"93253269-221c-4fa2-89e4-8f84eed7adfa\" (UID: \"93253269-221c-4fa2-89e4-8f84eed7adfa\") " Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.234143 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/93253269-221c-4fa2-89e4-8f84eed7adfa-kube-api-access-xx99p" (OuterVolumeSpecName: "kube-api-access-xx99p") pod "93253269-221c-4fa2-89e4-8f84eed7adfa" (UID: "93253269-221c-4fa2-89e4-8f84eed7adfa"). InnerVolumeSpecName "kube-api-access-xx99p". PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.313081 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/93253269-221c-4fa2-89e4-8f84eed7adfa-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "93253269-221c-4fa2-89e4-8f84eed7adfa" (UID: "93253269-221c-4fa2-89e4-8f84eed7adfa"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.325010 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xx99p\" (UniqueName: \"kubernetes.io/projected/93253269-221c-4fa2-89e4-8f84eed7adfa-kube-api-access-xx99p\") on node \"crc\" DevicePath \"\"" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.325058 4772 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/93253269-221c-4fa2-89e4-8f84eed7adfa-must-gather-output\") on node \"crc\" DevicePath \"\"" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.507731 4772 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-dsdnq_must-gather-k4dgg_93253269-221c-4fa2-89e4-8f84eed7adfa/copy/0.log" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.508685 4772 generic.go:334] "Generic (PLEG): container finished" podID="93253269-221c-4fa2-89e4-8f84eed7adfa" containerID="cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db" exitCode=143 Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.508758 4772 scope.go:117] "RemoveContainer" containerID="cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.508962 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-dsdnq/must-gather-k4dgg" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.536364 4772 scope.go:117] "RemoveContainer" containerID="a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.617580 4772 scope.go:117] "RemoveContainer" containerID="cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db" Mar 20 11:48:38 crc kubenswrapper[4772]: E0320 11:48:38.618174 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db\": container with ID starting with cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db not found: ID does not exist" containerID="cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.618244 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db"} err="failed to get container status \"cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db\": rpc error: code = NotFound desc = could not find container \"cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db\": container with ID starting with cc01394e21a8430f298a423b40e688fc5e2e420363b415d5561e1cbf65de45db not found: ID does not exist" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.618280 4772 scope.go:117] "RemoveContainer" containerID="a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a" Mar 20 11:48:38 crc kubenswrapper[4772]: E0320 11:48:38.618822 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a\": container with ID starting with a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a not found: ID does not exist" containerID="a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a" Mar 20 11:48:38 crc 
kubenswrapper[4772]: I0320 11:48:38.618937 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a"} err="failed to get container status \"a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a\": rpc error: code = NotFound desc = could not find container \"a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a\": container with ID starting with a55c913b9f42cdd1683fba1f90df79b52d0687e72db0f7c02bddbdcf918ecb5a not found: ID does not exist" Mar 20 11:48:38 crc kubenswrapper[4772]: I0320 11:48:38.649748 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="93253269-221c-4fa2-89e4-8f84eed7adfa" path="/var/lib/kubelet/pods/93253269-221c-4fa2-89e4-8f84eed7adfa/volumes" Mar 20 11:48:41 crc kubenswrapper[4772]: I0320 11:48:41.307811 4772 scope.go:117] "RemoveContainer" containerID="69e0f11b865d7fca2ec3d1de0b747929b49ab7b4928af53fb02555a98c4a8a36" Mar 20 11:48:48 crc kubenswrapper[4772]: I0320 11:48:48.642158 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:48:48 crc kubenswrapper[4772]: E0320 11:48:48.642790 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:49:02 crc kubenswrapper[4772]: I0320 11:49:02.642567 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:49:02 crc kubenswrapper[4772]: E0320 11:49:02.643479 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:49:13 crc kubenswrapper[4772]: I0320 11:49:13.642928 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:49:13 crc kubenswrapper[4772]: E0320 11:49:13.644964 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:49:25 crc kubenswrapper[4772]: I0320 11:49:25.642152 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:49:25 crc kubenswrapper[4772]: E0320 11:49:25.642977 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:49:38 crc kubenswrapper[4772]: I0320 11:49:38.641700 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:49:38 crc kubenswrapper[4772]: E0320 11:49:38.642654 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:49:53 crc kubenswrapper[4772]: I0320 11:49:53.641938 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:49:53 crc kubenswrapper[4772]: E0320 11:49:53.642620 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.179901 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-infra/auto-csr-approver-29566790-zj5v7"] Mar 20 11:50:00 crc kubenswrapper[4772]: E0320 11:50:00.180895 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93253269-221c-4fa2-89e4-8f84eed7adfa" containerName="copy" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.180912 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="93253269-221c-4fa2-89e4-8f84eed7adfa" containerName="copy" Mar 20 11:50:00 crc kubenswrapper[4772]: E0320 11:50:00.180927 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="93253269-221c-4fa2-89e4-8f84eed7adfa" containerName="gather" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.180935 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="93253269-221c-4fa2-89e4-8f84eed7adfa" containerName="gather" Mar 20 11:50:00 crc kubenswrapper[4772]: E0320 11:50:00.180952 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cddea62b-62e3-4285-88f3-f531265f0e1c" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.180963 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="cddea62b-62e3-4285-88f3-f531265f0e1c" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.181159 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="93253269-221c-4fa2-89e4-8f84eed7adfa" containerName="copy" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.181174 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="cddea62b-62e3-4285-88f3-f531265f0e1c" containerName="oc" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.181186 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="93253269-221c-4fa2-89e4-8f84eed7adfa" containerName="gather" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.181886 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-zj5v7" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.188868 4772 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-infra"/"csr-approver-sa-dockercfg-zlm59" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.188868 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"openshift-service-ca.crt" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.189313 4772 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-infra"/"kube-root-ca.crt" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.190253 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-zj5v7"] Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.287827 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-42lxz\" (UniqueName: \"kubernetes.io/projected/e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2-kube-api-access-42lxz\") pod \"auto-csr-approver-29566790-zj5v7\" (UID: \"e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2\") " pod="openshift-infra/auto-csr-approver-29566790-zj5v7" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.389410 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-42lxz\" (UniqueName: \"kubernetes.io/projected/e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2-kube-api-access-42lxz\") pod \"auto-csr-approver-29566790-zj5v7\" (UID: \"e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2\") " pod="openshift-infra/auto-csr-approver-29566790-zj5v7" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.408553 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-42lxz\" (UniqueName: \"kubernetes.io/projected/e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2-kube-api-access-42lxz\") pod \"auto-csr-approver-29566790-zj5v7\" (UID: \"e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2\") " pod="openshift-infra/auto-csr-approver-29566790-zj5v7" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.506385 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-zj5v7" Mar 20 11:50:00 crc kubenswrapper[4772]: I0320 11:50:00.929041 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-infra/auto-csr-approver-29566790-zj5v7"] Mar 20 11:50:01 crc kubenswrapper[4772]: I0320 11:50:01.072895 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-zj5v7" event={"ID":"e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2","Type":"ContainerStarted","Data":"38ccd0b9b1bf7277c2b8591cb8fee86c19121b7d2fa77d397ac86a3631843c98"} Mar 20 11:50:02 crc kubenswrapper[4772]: I0320 11:50:02.080477 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-zj5v7" event={"ID":"e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2","Type":"ContainerStarted","Data":"281ea3e7f3c0dba85636b1acf3e917c2ebd783308a9ad2387ad90d47983d880d"} Mar 20 11:50:02 crc kubenswrapper[4772]: I0320 11:50:02.101163 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-infra/auto-csr-approver-29566790-zj5v7" podStartSLOduration=1.280423692 podStartE2EDuration="2.101140918s" podCreationTimestamp="2026-03-20 11:50:00 +0000 UTC" firstStartedPulling="2026-03-20 11:50:00.936809909 +0000 UTC m=+3287.027776414" lastFinishedPulling="2026-03-20 11:50:01.757527155 +0000 UTC m=+3287.848493640" observedRunningTime="2026-03-20 11:50:02.0926971 +0000 UTC m=+3288.183663615" watchObservedRunningTime="2026-03-20 11:50:02.101140918 +0000 UTC m=+3288.192107403" Mar 20 11:50:03 crc kubenswrapper[4772]: I0320 11:50:03.092022 4772 generic.go:334] "Generic (PLEG): container finished" podID="e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2" containerID="281ea3e7f3c0dba85636b1acf3e917c2ebd783308a9ad2387ad90d47983d880d" exitCode=0 Mar 20 11:50:03 crc kubenswrapper[4772]: I0320 11:50:03.092074 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-zj5v7" event={"ID":"e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2","Type":"ContainerDied","Data":"281ea3e7f3c0dba85636b1acf3e917c2ebd783308a9ad2387ad90d47983d880d"} Mar 20 11:50:04 crc kubenswrapper[4772]: I0320 11:50:04.377481 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-zj5v7" Mar 20 11:50:04 crc kubenswrapper[4772]: I0320 11:50:04.543086 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-42lxz\" (UniqueName: \"kubernetes.io/projected/e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2-kube-api-access-42lxz\") pod \"e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2\" (UID: \"e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2\") " Mar 20 11:50:04 crc kubenswrapper[4772]: I0320 11:50:04.548145 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2-kube-api-access-42lxz" (OuterVolumeSpecName: "kube-api-access-42lxz") pod "e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2" (UID: "e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2"). InnerVolumeSpecName "kube-api-access-42lxz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:50:04 crc kubenswrapper[4772]: I0320 11:50:04.645224 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-42lxz\" (UniqueName: \"kubernetes.io/projected/e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2-kube-api-access-42lxz\") on node \"crc\" DevicePath \"\"" Mar 20 11:50:05 crc kubenswrapper[4772]: I0320 11:50:05.108350 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-infra/auto-csr-approver-29566790-zj5v7" event={"ID":"e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2","Type":"ContainerDied","Data":"38ccd0b9b1bf7277c2b8591cb8fee86c19121b7d2fa77d397ac86a3631843c98"} Mar 20 11:50:05 crc kubenswrapper[4772]: I0320 11:50:05.108748 4772 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="38ccd0b9b1bf7277c2b8591cb8fee86c19121b7d2fa77d397ac86a3631843c98" Mar 20 11:50:05 crc kubenswrapper[4772]: I0320 11:50:05.108448 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-infra/auto-csr-approver-29566790-zj5v7" Mar 20 11:50:05 crc kubenswrapper[4772]: I0320 11:50:05.175197 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-xshgf"] Mar 20 11:50:05 crc kubenswrapper[4772]: I0320 11:50:05.182209 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-infra/auto-csr-approver-29566784-xshgf"] Mar 20 11:50:06 crc kubenswrapper[4772]: I0320 11:50:06.651544 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68262080-18cc-4d09-b2ef-2f7629f952ae" path="/var/lib/kubelet/pods/68262080-18cc-4d09-b2ef-2f7629f952ae/volumes" Mar 20 11:50:08 crc kubenswrapper[4772]: I0320 11:50:08.642342 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:50:08 crc kubenswrapper[4772]: E0320 11:50:08.642915 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:50:20 crc kubenswrapper[4772]: I0320 11:50:20.642006 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:50:20 crc kubenswrapper[4772]: E0320 11:50:20.643897 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:50:35 crc kubenswrapper[4772]: I0320 11:50:35.642292 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:50:35 crc kubenswrapper[4772]: E0320 11:50:35.643000 4772 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-ltsw5_openshift-machine-config-operator(ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e)\"" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" podUID="ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e" Mar 20 11:50:41 crc kubenswrapper[4772]: I0320 11:50:41.399300 4772 scope.go:117] "RemoveContainer" containerID="188f12a465b3dae06b4b37a58a01defaaefaa21a8d26f61b3d4349df09c58e97" Mar 20 11:50:50 crc kubenswrapper[4772]: I0320 11:50:50.642503 4772 scope.go:117] "RemoveContainer" containerID="ff128fe803c3b7510640cc9231477636770b0d6acfb28ac87dff9665112ca06a" Mar 20 11:50:51 crc kubenswrapper[4772]: I0320 11:50:51.446662 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-ltsw5" event={"ID":"ea07e2c1-7a61-4afd-97b4-22f8f9dc5c3e","Type":"ContainerStarted","Data":"38aee9fa6df4064d34c0e993458e29a77a24c353e358c9caaa36c49dc70fe51e"} Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.058681 4772 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bwpcq"] Mar 20 11:50:58 crc kubenswrapper[4772]: E0320 11:50:58.059642 4772 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2" containerName="oc" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.059659 4772 state_mem.go:107] "Deleted CPUSet assignment" podUID="e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2" containerName="oc" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.059826 4772 memory_manager.go:354] "RemoveStaleState removing state" podUID="e6ee4af2-2458-4e41-8b8b-8daa0cc2bbb2" containerName="oc" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.060981 4772 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.098346 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j79fb\" (UniqueName: \"kubernetes.io/projected/d65afa1d-fe10-4b41-9284-38aff53bf87d-kube-api-access-j79fb\") pod \"redhat-operators-bwpcq\" (UID: \"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.098431 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-catalog-content\") pod \"redhat-operators-bwpcq\" (UID: \"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.098467 4772 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-utilities\") pod \"redhat-operators-bwpcq\" (UID: \"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.138896 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bwpcq"] Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.199730 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-catalog-content\") pod \"redhat-operators-bwpcq\" (UID: 
\"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.199787 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-utilities\") pod \"redhat-operators-bwpcq\" (UID: \"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.199885 4772 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j79fb\" (UniqueName: \"kubernetes.io/projected/d65afa1d-fe10-4b41-9284-38aff53bf87d-kube-api-access-j79fb\") pod \"redhat-operators-bwpcq\" (UID: \"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.200196 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-catalog-content\") pod \"redhat-operators-bwpcq\" (UID: \"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.200390 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-utilities\") pod \"redhat-operators-bwpcq\" (UID: \"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.230864 4772 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j79fb\" (UniqueName: \"kubernetes.io/projected/d65afa1d-fe10-4b41-9284-38aff53bf87d-kube-api-access-j79fb\") pod \"redhat-operators-bwpcq\" (UID: \"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.382451 4772 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:50:58 crc kubenswrapper[4772]: I0320 11:50:58.844300 4772 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bwpcq"] Mar 20 11:50:59 crc kubenswrapper[4772]: I0320 11:50:59.512560 4772 generic.go:334] "Generic (PLEG): container finished" podID="d65afa1d-fe10-4b41-9284-38aff53bf87d" containerID="c8363024786b0b9762d13f2b9a5ee595976e324ffbf659e3a39d83744aa4b3d0" exitCode=0 Mar 20 11:50:59 crc kubenswrapper[4772]: I0320 11:50:59.512646 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwpcq" event={"ID":"d65afa1d-fe10-4b41-9284-38aff53bf87d","Type":"ContainerDied","Data":"c8363024786b0b9762d13f2b9a5ee595976e324ffbf659e3a39d83744aa4b3d0"} Mar 20 11:50:59 crc kubenswrapper[4772]: I0320 11:50:59.512897 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwpcq" event={"ID":"d65afa1d-fe10-4b41-9284-38aff53bf87d","Type":"ContainerStarted","Data":"e0fdb58c538e8e709e36370048bbe4caa208534eef72979a4933d9f9f6d109cf"} Mar 20 11:51:00 crc kubenswrapper[4772]: I0320 11:51:00.522530 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwpcq" event={"ID":"d65afa1d-fe10-4b41-9284-38aff53bf87d","Type":"ContainerStarted","Data":"c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a"} Mar 20 11:51:01 crc kubenswrapper[4772]: I0320 11:51:01.531970 4772 generic.go:334] "Generic (PLEG): container finished" podID="d65afa1d-fe10-4b41-9284-38aff53bf87d" containerID="c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a" exitCode=0 Mar 20 11:51:01 crc kubenswrapper[4772]: I0320 11:51:01.532098 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwpcq" event={"ID":"d65afa1d-fe10-4b41-9284-38aff53bf87d","Type":"ContainerDied","Data":"c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a"} Mar 20 11:51:01 crc kubenswrapper[4772]: I0320 11:51:01.534483 4772 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Mar 20 11:51:02 crc kubenswrapper[4772]: I0320 11:51:02.545633 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwpcq" event={"ID":"d65afa1d-fe10-4b41-9284-38aff53bf87d","Type":"ContainerStarted","Data":"71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f"} Mar 20 11:51:02 crc kubenswrapper[4772]: I0320 11:51:02.577834 4772 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bwpcq" podStartSLOduration=2.191749252 podStartE2EDuration="4.577801983s" podCreationTimestamp="2026-03-20 11:50:58 +0000 UTC" firstStartedPulling="2026-03-20 11:50:59.514267929 +0000 UTC m=+3345.605234414" lastFinishedPulling="2026-03-20 11:51:01.90032066 +0000 UTC m=+3347.991287145" observedRunningTime="2026-03-20 11:51:02.568681326 +0000 UTC m=+3348.659647831" watchObservedRunningTime="2026-03-20 11:51:02.577801983 +0000 UTC m=+3348.668768468" Mar 20 11:51:08 crc kubenswrapper[4772]: I0320 11:51:08.383001 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:51:08 crc kubenswrapper[4772]: I0320 11:51:08.383367 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:51:08 crc 
kubenswrapper[4772]: I0320 11:51:08.426874 4772 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:51:08 crc kubenswrapper[4772]: I0320 11:51:08.621450 4772 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:51:08 crc kubenswrapper[4772]: I0320 11:51:08.668109 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bwpcq"] Mar 20 11:51:10 crc kubenswrapper[4772]: I0320 11:51:10.601894 4772 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bwpcq" podUID="d65afa1d-fe10-4b41-9284-38aff53bf87d" containerName="registry-server" containerID="cri-o://71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f" gracePeriod=2 Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.102740 4772 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.113868 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-catalog-content\") pod \"d65afa1d-fe10-4b41-9284-38aff53bf87d\" (UID: \"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.114045 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j79fb\" (UniqueName: \"kubernetes.io/projected/d65afa1d-fe10-4b41-9284-38aff53bf87d-kube-api-access-j79fb\") pod \"d65afa1d-fe10-4b41-9284-38aff53bf87d\" (UID: \"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.114200 4772 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-utilities\") pod \"d65afa1d-fe10-4b41-9284-38aff53bf87d\" (UID: \"d65afa1d-fe10-4b41-9284-38aff53bf87d\") " Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.115948 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-utilities" (OuterVolumeSpecName: "utilities") pod "d65afa1d-fe10-4b41-9284-38aff53bf87d" (UID: "d65afa1d-fe10-4b41-9284-38aff53bf87d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.125643 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d65afa1d-fe10-4b41-9284-38aff53bf87d-kube-api-access-j79fb" (OuterVolumeSpecName: "kube-api-access-j79fb") pod "d65afa1d-fe10-4b41-9284-38aff53bf87d" (UID: "d65afa1d-fe10-4b41-9284-38aff53bf87d"). InnerVolumeSpecName "kube-api-access-j79fb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.216997 4772 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j79fb\" (UniqueName: \"kubernetes.io/projected/d65afa1d-fe10-4b41-9284-38aff53bf87d-kube-api-access-j79fb\") on node \"crc\" DevicePath \"\"" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.217041 4772 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-utilities\") on node \"crc\" DevicePath \"\"" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.286696 4772 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d65afa1d-fe10-4b41-9284-38aff53bf87d" (UID: "d65afa1d-fe10-4b41-9284-38aff53bf87d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.319311 4772 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d65afa1d-fe10-4b41-9284-38aff53bf87d-catalog-content\") on node \"crc\" DevicePath \"\"" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.617597 4772 generic.go:334] "Generic (PLEG): container finished" podID="d65afa1d-fe10-4b41-9284-38aff53bf87d" containerID="71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f" exitCode=0 Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.617662 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwpcq" event={"ID":"d65afa1d-fe10-4b41-9284-38aff53bf87d","Type":"ContainerDied","Data":"71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f"} Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.617697 4772 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bwpcq" event={"ID":"d65afa1d-fe10-4b41-9284-38aff53bf87d","Type":"ContainerDied","Data":"e0fdb58c538e8e709e36370048bbe4caa208534eef72979a4933d9f9f6d109cf"} Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.617717 4772 scope.go:117] "RemoveContainer" containerID="71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.617713 4772 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bwpcq" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.655420 4772 scope.go:117] "RemoveContainer" containerID="c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.660227 4772 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bwpcq"] Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.661132 4772 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bwpcq"] Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.675640 4772 scope.go:117] "RemoveContainer" containerID="c8363024786b0b9762d13f2b9a5ee595976e324ffbf659e3a39d83744aa4b3d0" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.696742 4772 scope.go:117] "RemoveContainer" containerID="71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f" Mar 20 11:51:12 crc kubenswrapper[4772]: E0320 11:51:12.697868 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f\": container with ID starting with 71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f not found: ID does not exist" containerID="71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.697911 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f"} err="failed to get container status \"71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f\": rpc error: code = NotFound desc = could not find container \"71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f\": container with ID starting with 71016fd39cb9f58b671a29eae1e33841209ebba013ac1f99e276090f8795964f not found: ID does not exist" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.697956 4772 scope.go:117] "RemoveContainer" containerID="c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a" Mar 20 11:51:12 crc kubenswrapper[4772]: E0320 11:51:12.698313 4772 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a\": container with ID starting with c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a not found: ID does not exist" containerID="c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.698371 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a"} err="failed to get container status \"c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a\": rpc error: code = NotFound desc = could not find container \"c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a\": container with ID starting with c1811104ec3e5f9ae16e3347f1b751132d35728c0f17ad06898b9208d3436c8a not found: ID does not exist" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.698404 4772 scope.go:117] "RemoveContainer" containerID="c8363024786b0b9762d13f2b9a5ee595976e324ffbf659e3a39d83744aa4b3d0" Mar 20 11:51:12 crc kubenswrapper[4772]: E0320 11:51:12.698741 4772 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"c8363024786b0b9762d13f2b9a5ee595976e324ffbf659e3a39d83744aa4b3d0\": container with ID starting with c8363024786b0b9762d13f2b9a5ee595976e324ffbf659e3a39d83744aa4b3d0 not found: ID does not exist" containerID="c8363024786b0b9762d13f2b9a5ee595976e324ffbf659e3a39d83744aa4b3d0" Mar 20 11:51:12 crc kubenswrapper[4772]: I0320 11:51:12.698771 4772 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c8363024786b0b9762d13f2b9a5ee595976e324ffbf659e3a39d83744aa4b3d0"} err="failed to get container status \"c8363024786b0b9762d13f2b9a5ee595976e324ffbf659e3a39d83744aa4b3d0\": rpc error: code = NotFound desc = could not find container \"c8363024786b0b9762d13f2b9a5ee595976e324ffbf659e3a39d83744aa4b3d0\": container with ID starting with c8363024786b0b9762d13f2b9a5ee595976e324ffbf659e3a39d83744aa4b3d0 not found: ID does not exist" Mar 20 11:51:14 crc kubenswrapper[4772]: I0320 11:51:14.650720 4772 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d65afa1d-fe10-4b41-9284-38aff53bf87d" path="/var/lib/kubelet/pods/d65afa1d-fe10-4b41-9284-38aff53bf87d/volumes"